Answer by Peter Rose for How can I change column types in Spark SQL's DataFrame?
To convert year from string to int, add the following option to the CSV reader: "inferSchema" -> "true"; see the Databricks spark-csv documentation.
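A minimal sketch of that option in context, using the same spark-csv loader as the question (assumes a cars.csv file with a header row):

```scala
// With inferSchema enabled, spark-csv samples the data and picks column
// types, so "year" comes back as int instead of the default string.
val df = sqlContext.load("com.databricks.spark.csv",
  Map("path" -> "cars.csv", "header" -> "true", "inferSchema" -> "true"))
df.printSchema()
```

Note the trade-off: inferring the schema costs an extra pass over the input file.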
Answer by dnlbrky for How can I change column types in Spark SQL's DataFrame?
You can use selectExpr to make it a little cleaner:

    df.selectExpr("cast(year as int) as year", "upper(make) as make", "model", "comment", "blank")
View ArticleAnswer by Svend for How can I change column types in Spark SQL's DataFrame?
[EDIT: March 2016: thanks for the votes! Though really, this is not the best answer, I think the solutions based on withColumn, withColumnRenamed and cast put forward by msemelman, Martin Senne and...
How can I change column types in Spark SQL's DataFrame?
Suppose I'm doing something like:

    val df = sqlContext.load("com.databricks.spark.csv", Map("path" -> "cars.csv", "header" -> "true"))
    df.printSchema()

    root
     |-- year: string (nullable = true)
     |--...