Error while reading multiline JSON

0 votes

Hi, 

I am getting the error below while reading JSON data.

scala> val peopleDF = spark.read.option("multiLine", true).json("/user/edureka_311477/sampjson.json")
peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]
May 23, 2019 in Apache Spark by Ritu
2,819 views

1 answer to this question.

0 votes

peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string]

What you are getting above is not an error message. It is just an info message telling you that your DataFrame has been created, but the JSON in the file is not in a format Spark can parse, so the JSON reader loads the content as a corrupt record (hence the _corrupt_record column).

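For reference, here is a minimal sketch of how a valid multiline JSON file and the corrected read would look. The file path /tmp/people.json and the field names are hypothetical examples, not taken from your file.

// /tmp/people.json -- a hypothetical pretty-printed JSON array spanning several lines:
// [
//   { "name": "Alice", "age": 28 },
//   { "name": "Bob",   "age": 31 }
// ]

// In spark-shell, `spark` is the predefined SparkSession.
// With multiLine enabled, Spark parses the whole file as one JSON document
// instead of expecting one JSON object per line.
val peopleDF = spark.read
  .option("multiLine", true)
  .json("/tmp/people.json")

// If the schema still shows only [_corrupt_record: string], the file itself
// is not valid JSON; inspect the raw content to see what could not be parsed.
peopleDF.printSchema()
peopleDF.show(false)

If your sampjson.json is a single JSON object or array spread over multiple lines, fixing its syntax (for example with a JSON validator) and keeping the multiLine option should give you a proper schema instead of _corrupt_record.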

answered May 23, 2019 by Conny
