org.apache.spark.sql
:: Experimental :: Used to convert a JVM object of type T to and from the internal Spark SQL representation.
Encoders are generally created automatically through implicits from a SQLContext.
import sqlContext.implicits._

val ds = Seq(1, 2, 3).toDS() // implicitly provided (sqlContext.implicits.newIntEncoder)
Encoders are specified by calling static methods on Encoders.
List<String> data = Arrays.asList("abc", "abc", "xyz");
Dataset<String> ds = context.createDataset(data, Encoders.STRING());
Encoders can be composed into tuples:
Encoder<Tuple2<Integer, String>> encoder2 = Encoders.tuple(Encoders.INT(), Encoders.STRING());
List<Tuple2<Integer, String>> data2 = Arrays.asList(new scala.Tuple2(1, "a"));
Dataset<Tuple2<Integer, String>> ds2 = context.createDataset(data2, encoder2);
Or constructed from Java Beans:
Encoders.bean(MyClass.class);
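Conceptually, an Encoder pairs two conversions: JVM object to internal row, and internal row back to a JVM object. The Java sketch below is a toy model of that round trip, with a plain Object[] standing in for Spark's binary internal row format; ToyEncoder, toRow, and fromRow are illustrative names, not Spark API.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

public class ToyEncoderDemo {
    // A "row" here is just an Object[]; Spark's real encoders target a
    // compact binary InternalRow and generate bytecode for the conversion.
    interface ToyEncoder<T> {
        Object[] toRow(T value);   // JVM object -> flat row
        T fromRow(Object[] row);   // flat row -> JVM object
    }

    // Toy analogue of Encoders.tuple(Encoders.INT(), Encoders.STRING())
    static final ToyEncoder<Map.Entry<Integer, String>> PAIR =
        new ToyEncoder<Map.Entry<Integer, String>>() {
            public Object[] toRow(Map.Entry<Integer, String> v) {
                return new Object[] { v.getKey(), v.getValue() };
            }
            public Map.Entry<Integer, String> fromRow(Object[] row) {
                return new SimpleEntry<Integer, String>((Integer) row[0], (String) row[1]);
            }
        };

    public static void main(String[] args) {
        Object[] row = PAIR.toRow(new SimpleEntry<Integer, String>(1, "a"));
        Map.Entry<Integer, String> back = PAIR.fromRow(row);
        System.out.println(back.getKey() + "," + back.getValue());
    }
}
```

Spark's real encoders additionally expose the schema of the encoded form and a ClassTag for the object type, described below.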
Since: 1.6.0
A ClassTag that can be used to construct an Array to contain a collection of T.
Returns the schema produced by encoding this type of object as a Row.