We study the minimization problem of a non-convex sparsity-promoting penalty function, the transformed l_1 (TL1), and its application in compressed sensing (CS). The TL1 penalty interpolates the l_0 and l_1 norms through a nonnegative parameter a, similar to the l_p quasi-norm with p in (0,1), and is known to satisfy unbiasedness, sparsity and Lipschitz continuity properties. We first consider the constrained minimization problem and discuss the exact recovery of the l_0 norm minimal solution based on the null space property (NSP). We then prove the stable recovery of the l_0 norm minimal solution if the sensing matrix A satisfies a restricted isometry property (RIP). Next, we present difference of convex algorithms for TL1 (DCATL1) for computing TL1-regularized constrained and unconstrained problems in CS. The inner loop concerns an l_1 minimization problem, which we solve by the Alternating Direction Method of Multipliers (ADMM). For the unconstrained problem, we prove convergence of DCATL1 to a stationary point satisfying the first-order optimality condition. In numerical experiments, we identify the optimal value of the parameter a, and compare DCATL1 with other CS algorithms on two classes of sensing matrices: Gaussian random matrices and over-sampled discrete cosine transform (DCT) matrices. We find that for both classes of sensing matrices, the performance of the DCATL1 algorithm (initialized with l_1 minimization) always ranks near the top (if not at the top), and is the most robust choice, insensitive to the conditioning of the sensing matrix A. DCATL1 is also competitive in comparison with DCA applied to other non-convex penalty functions commonly used in statistics with two hyperparameters.
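For orientation, a sketch of the TL1 penalty in the form commonly used in the transformed l_1 literature; the exact normalization assumed here (the factor a+1) is a convention and may differ from the form adopted in the body of the paper:
\[
P_a(x) \;=\; \sum_{i=1}^{n} \rho_a(x_i), \qquad
\rho_a(t) \;=\; \frac{(a+1)\,|t|}{a + |t|}, \qquad a > 0,
\]
so that, entrywise, \(\rho_a(t) \to \mathbf{1}_{\{t \neq 0\}}\) as \(a \to 0^{+}\) (l_0-like counting behavior) and \(\rho_a(t) \to |t|\) as \(a \to \infty\) (l_1-like behavior). This interpolation is what allows a single parameter a to trade off the sparsity promotion of l_0 against the Lipschitz continuity enjoyed by l_1.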