VW to sell electric cars in U.S. in 2015, report says

German carmaker Volkswagen plans to start selling electric cars in the United States in 2015, The New York Times reported, citing a Volkswagen official.