From 0310f951c936f4a5c1464c8119c30fddc4a32e89 Mon Sep 17 00:00:00 2001
From: Jack Del Vecchio
Date: Tue, 3 Oct 2023 16:53:29 +0000
Subject: [PATCH] Update README file.

---
 plugins/parquet/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/plugins/parquet/README.md b/plugins/parquet/README.md
index d3b3802d195..241d89d7a51 100644
--- a/plugins/parquet/README.md
+++ b/plugins/parquet/README.md
@@ -33,7 +33,7 @@ The Parquet Plugin offers the following main functions:
 The Read function allows ECL programmers to create an ECL dataset from both regular and partitioned Parquet files. It leverages the Apache Arrow interface for Parquet to efficiently stream data from ECL to the plugin, ensuring optimized data transfer.
 
 ```
-dataset := Read(layout, '/source/directory/data.parquet');
+dataset := ParquetIO.Read(layout, '/source/directory/data.parquet');
 ```
 
 #### 2. Writing Parquet Files
@@ -41,7 +41,7 @@ dataset := Read(layout, '/source/directory/data.parquet');
 The Write function empowers ECL programmers to write ECL datasets to Parquet files. By leveraging the Parquet format's columnar storage capabilities, this function provides efficient compression and optimized storage for data.
 
 ```
-Write(inDataset, '/output/directory/data.parquet');
+ParquetIO.Write(inDataset, '/output/directory/data.parquet');
 ```
 
 ### Partitioned Files (Tabular Datasets)
@@ -51,7 +51,7 @@ Write(inDataset, '/output/directory/data.parquet');
 #### 1. Reading Partitioned Files
 The Read Partition function extends the Read functionality by enabling ECL programmers to read from partitioned Parquet files.
 ```
-github_dataset := ReadPartition(layout, '/source/directory/partioned_dataset');
+github_dataset := ParquetIO.ReadPartition(layout, '/source/directory/partioned_dataset');
 ```
 
 #### 2. Writing Partitioned Files
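
---
Note: all three hunks make the same correction — the README's example calls are scoped under the `ParquetIO` module instead of being written as bare functions. Gathered into one ECL sketch for reference; the `IMPORT` line and the `layout`/`inDataset` definitions are assumptions for illustration, not part of this patch:

```
IMPORT Parquet; // assumed: the plugin import used alongside ParquetIO in the README

// Hypothetical record layout for illustration only.
layout := RECORD
    STRING name;
    INTEGER value;
END;

inDataset := DATASET([{'example', 1}], layout); // placeholder input dataset

// Corrected calls from the patched README, scoped under ParquetIO.
ds := ParquetIO.Read(layout, '/source/directory/data.parquet');
ParquetIO.Write(inDataset, '/output/directory/data.parquet');
github_dataset := ParquetIO.ReadPartition(layout, '/source/directory/partioned_dataset');
```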