1.Steps:
1.Connect to and operate on MySQL from Spark
2.Exit any existing spark-shell session: scala> :q
3.Load the MySQL JDBC driver JAR
1.Connect to MySQL (start spark-shell with the driver JAR on the classpath):
[root@spark1 spark-2.4.5-bin-hadoop2.7]# spark-shell --jars /home/data/jars/mysql-connector-java-5.1.38.jar
Spark context Web UI available at http://spark1:4040
Spark context available as 'sc' (master = local[*], app id = local-1582001587565).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.5
      /_/
Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_162)
Type in expressions to have them evaluated.
Type :help for more information.
2.Create the event database on the MySQL side:
mysql> create database if not exists event;
Query OK, 1 row affected (0.00 sec)
mysql> use event;
Database changed
mysql>
3.Set up the JDBC connection properties:
scala> import java.util.Propert    // pressing Tab here shows the completions:
Properties   PropertyPermission   PropertyResourceBundle
//put the driver class, username, and password into the properties
scala> import java.util.Properties
import java.util.Properties
scala> val props=new Properties()
props: java.util.Properties = {}
scala> props.put("driver","com.mysql.jdbc.Driver")
res2: Object = null
scala> props.put("user","root")
res3: Object = null
scala> props.put("password","root")
res4: Object = null
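The write step below calls userDF.write, but the transcript never shows how userDF was built. A minimal sketch for trying the write yourself, assuming you are inside spark-shell (where spark is the pre-created SparkSession) and using a hypothetical two-column schema and sample rows:

```scala
// Hypothetical sample data; the real userDF's schema is not shown in this transcript.
// `spark` is the SparkSession that spark-shell creates automatically.
import spark.implicits._

val userDF = Seq(
  (1, "alice"),
  (2, "bob")
).toDF("user_id", "user_name")

userDF.show()
```

With mode("overwrite") in the next step, Spark drops and recreates the users table from this DataFrame's schema, so the table does not need to exist beforehand.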
4.Write the DataFrame to MySQL:
scala> userDF.write.mode("overwrite").jdbc("jdbc:mysql://flume:3306/event","users",props)
5.Query on the MySQL side:
mysql> select * from users limit 10;
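The round trip can also be verified from Spark itself rather than the mysql client, by reading the table back with spark.read.jdbc. A sketch, assuming the same session and the props object built in step 3:

```scala
// Reuses the Properties object (driver, user, password) from step 3.
val readBack = spark.read.jdbc("jdbc:mysql://flume:3306/event", "users", props)

// Equivalent to: select * from users limit 10;
readBack.show(10)
```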