pyspark to mysql: Writing a DataFrame to a MySQL table with pySpark

I am attempting to insert records into a MySql table. The table contains id and name as columns.

I am doing like below in a pyspark shell.

    name = 'tester_1'
    id = '103'

    import pandas as pd
    l = [id, name]
    df = pd.DataFrame([l])

    df.write.format('jdbc').options(
        url='jdbc:mysql://localhost/database_name',
        driver='com.mysql.jdbc.Driver',
        dbtable='DestinationTableName',
        user='your_user_name',
        password='your_password').mode('append').save()

I am getting the following attribute error:

AttributeError: 'DataFrame' object has no attribute 'write'

What am I doing wrong? What is the correct way to insert records into a MySQL table from pySpark?

Solution: Use a Spark DataFrame instead of a pandas one, since .write is available only on Spark DataFrames.

So the final code could be:

    # Each record must be a tuple (one element per column), not a flat
    # list of values, or toDF cannot build the two-column schema.
    data = [('103', 'tester_1')]
    df = sc.parallelize(data).toDF(['id', 'name'])

    df.write.format('jdbc').options(
        url='jdbc:mysql://localhost/database_name',
        driver='com.mysql.jdbc.Driver',
        dbtable='DestinationTableName',
        user='your_user_name',
        password='your_password').mode('append').save()