In this paper, we first study $\ell_q$ minimization and its associated iterative reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on \emph{unconstrained} $\ell_q$ minimization, for which we show several advantages in the presence of noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., \emph{Comm. Pure Appl. Math.}, 63 (2010), pp. 1--38] for \emph{constrained} $\ell_q$ minimization, we begin with a preliminary yet novel analysis of \emph{unconstrained} $\ell_q$ minimization, covering convergence, an error bound, and local convergence behavior. The algorithm and analysis are then extended to the recovery of low-rank matrices. The algorithms for both vector and matrix recovery are compared with state-of-the-art algorithms and show superior performance in recovering sparse vectors and low-rank matrices.
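To make the setting concrete, the sketch below illustrates one common iteratively reweighted least-squares iteration for an unconstrained $\ell_q$ problem; the smoothed objective, the weight update, the parameter $\epsilon$, and all values chosen here are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of an iteratively reweighted least-squares (IRLS) loop for an
# unconstrained ell_q problem of the (assumed) form
#     min_x  0.5*||A x - b||_2^2 + lam * sum_i (x_i^2 + eps^2)^(q/2),  0 < q < 1.
# This is an illustration under stated assumptions, not the paper's algorithm.
import numpy as np

def irls_lq(A, b, q=0.5, lam=1e-2, eps=1e-2, iters=50):
    m, n = A.shape
    x = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(iters):
        # Weights from the current iterate; eps keeps them finite at zero entries.
        w = (x**2 + eps**2) ** (q / 2 - 1)
        # Each iteration minimizes the quadratic surrogate
        #     0.5*||A x - b||^2 + (lam*q/2) * sum_i w_i x_i^2,
        # whose normal equations are (A^T A + lam*q*W) x = A^T b.
        x = np.linalg.solve(AtA + lam * q * np.diag(w), Atb)
    return x

# Usage: recover a sparse vector from a few noisy random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(40)
print(np.linalg.norm(irls_lq(A, b) - x_true))
```

Because $t \mapsto t^{q/2}$ is concave, the weighted quadratic term majorizes the smoothed $\ell_q$ penalty at the current iterate, so each solve does not increase the (assumed) objective; the matrix case would replace the entrywise weights with weights on singular values.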