  1. What are the pros and cons of the Apache Parquet format compared …

    Apr 24, 2016 · Parquet files are most commonly compressed with the Snappy compression algorithm. Snappy compressed files are splittable and quick to inflate. Big data systems want to …

  2. Reading / Fixing a corrupt parquet file - Stack Overflow

    Sep 3, 2024 · Reading / Fixing a corrupt parquet file

  3. Is it possible to read parquet files in chunks? - Stack Overflow

    Nov 29, 2019 · The Parquet format stores the data in chunks, but there isn't a documented way to read it in chunks the way read_csv can. Is there a way to read parquet files in chunks?

  4. How to read a Parquet file into Pandas DataFrame?

    How to read a modestly sized Parquet data-set into an in-memory Pandas DataFrame without setting up a cluster computing infrastructure such as Hadoop or Spark? This is only a moderate amount of data …

  5. Inspect Parquet from command line - Stack Overflow

    How do I inspect the content of a Parquet file from the command line? The only option I see now is `$ hadoop fs -get my-path local-file` followed by `$ parquet-tools head local-file | less`. I would like to avoid

  6. Python: save pandas data frame to parquet file - Stack Overflow

    Dec 9, 2016 · Is it possible to save a pandas data frame directly to a parquet file? If not, what would be the suggested process? The aim is to be able to send the parquet file to another team, which they …

  7. pandas df.to_parquet write to multiple smaller files

    Sep 6, 2020 · Is it possible to use Pandas' DataFrame.to_parquet functionality to split writing into multiple files of some approximate desired size? I have a very large DataFrame (100M x 100), and …

  8. How do I get schema / column names from parquet file?

    Nov 24, 2015 · Also, Cloudera (which supports and contributes heavily to Parquet) has a nice page with examples on usage of hangxie's parquet-tools. An example from that page for your use case: …

  9. Convert csv to parquet file using python - Stack Overflow

    May 30, 2018 · I am trying to convert a .csv file to a .parquet file. The csv file (Temp.csv) has the following format 1,Jon,Doe,Denver I am using the following python code to convert it into parquet …

  10. Spark parquet partitioning : Large number of files

    Jun 28, 2017 · I am trying to leverage spark partitioning. I was trying to do something like data.write.partitionBy("key").parquet("/location"). The issue here is that each partition creates a huge number …
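A hedged pyspark sketch of the usual fix (requires pyspark; the column `key` and the output path are illustrative): repartition by the same column before `partitionBy`, so each output partition is written by a single task and you get one file per key rather than one file per key per task.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

# illustrative data: 1000 rows spread over 10 key values
data = spark.range(1000).withColumn("key", col("id") % 10)

(data.repartition("key")            # colocate each key's rows in one task
     .write.mode("overwrite")
     .partitionBy("key")
     .parquet("/tmp/partitioned_out"))
```

Without the `repartition`, every shuffle partition that contains a given key writes its own file into that key's directory, which is where the file explosion comes from.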