While migrating from PySpark to Spark with Scala I ran into a problem: `SQLContext`'s `registerDataFrameAsTable` method is private in the Scala API. This made me think my approach might be incorrect. In PySpark I do the following:

1. Load each table: `df = sqlContext.load(source, url, dbtable)`
2. Register each one: `sqlContext.registerDataFrameAsTable(df, dbtable)`
3. Run my queries with the `sqlContext.sql` method (which is basically all I need).
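For reference, the steps above can be sketched as a small helper. This is a hypothetical illustration of the Spark 1.x PySpark workflow described; `register_tables`, `url`, and `tables` are names I made up, and a live `SQLContext` is assumed when actually running it:

```python
def register_tables(sqlContext, url, tables):
    """Load each JDBC table into a DataFrame and register it for SQL queries.

    Sketch of the PySpark 1.x workflow: load, register, then query via SQL.
    """
    for dbtable in tables:
        # Spark 1.x generic loader; `source` here is the "jdbc" data source
        df = sqlContext.load(source="jdbc", url=url, dbtable=dbtable)
        # registerDataFrameAsTable is public in the PySpark API
        sqlContext.registerDataFrameAsTable(df, dbtable)

# After registration, plain SQL works against the registered names, e.g.:
# result = sqlContext.sql("SELECT * FROM some_table WHERE id > 10")
```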
Is this the right way to do it? How can I achieve the same thing in Scala?