
ftcsv


ftcsv is a fast, pure Lua CSV library.

It works well for CSVs that can be fully loaded into memory (easily up to a hundred MB) and correctly handles \n (LF), \r (CR), and \r\n (CRLF) line endings. It has UTF-8 support and will strip out the BOM if present. ftcsv can also parse headerless csv-like files and supports column remapping, file- or string-based loading, and more!

Currently, there isn't a "large" file mode with proper readers for ingesting large CSVs using a fixed amount of memory, but that is in the works in another branch!

It's been tested with LuaJIT 2.0/2.1 and Lua 5.1, 5.2, and 5.3.

Installing

You can either grab ftcsv.lua from here or install via luarocks:

luarocks install ftcsv

Parsing

ftcsv.parse(fileName, delimiter [, options])

ftcsv will load the entire csv file into memory, then parse it in one go, returning a lua table with the parsed data and a lua table containing the column headers. It has only two required parameters - a file name and delimiter (limited to one character). A few optional parameters can be passed in via a table (examples below).

Just loading a csv file:

local ftcsv = require('ftcsv')
local zipcodes, headers = ftcsv.parse("free-zipcode-database.csv", ",")
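
Each returned row is a key-value table keyed by the column headers, and the second return value lists the column names in file order, so you can work with the result directly. A minimal sketch of iterating over the parse above (the "Zipcode" column name is just an assumption about this particular file):

-- print the column names in file order
for _, columnName in ipairs(headers) do
    print(columnName)
end

-- each row is keyed by header name; "Zipcode" is assumed for illustration
for _, row in ipairs(zipcodes) do
    print(row["Zipcode"])
end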

Options

The following optional parameters are passed in via the third argument as a table. For example, if you want to load from a string and skip the headers, you could use the following:

ftcsv.parse("apple,banana,carrot", ",", {loadFromString=true, headers=false})
  • loadFromString

    If you want to load a csv from a string instead of a file, set loadFromString to true (default: false)

    ftcsv.parse("a,b,c\r\n1,2,3", ",", {loadFromString=true})
    
  • rename

    If you want to rename a field, you can set rename to change the field names. The below example will change the headers from a,b,c to d,e,f

    Note: You can rename two fields to the same value; ftcsv will keep the field that appears latest in the line.

    local options = {loadFromString=true, rename={["a"] = "d", ["b"] = "e", ["c"] = "f"}}
    local actual = ftcsv.parse("a,b,c\r\napple,banana,carrot", ",", options)
    
  • fieldsToKeep

    If you only want to keep certain fields from the CSV, send them in as a table-list and it should parse a little faster and use less memory.

    Note: If you want to keep a renamed field, put the new name of the field in fieldsToKeep:

    local options = {loadFromString=true, fieldsToKeep={"a","f"}, rename={["c"] = "f"}}
    local actual = ftcsv.parse("a,b,c\r\napple,banana,carrot\r\n", ",", options)
    
  • headerFunc

    Applies a function to every field in the header. If you are using rename, the function is applied after the rename.

    Ex: making all fields uppercase

    local options = {loadFromString=true, headerFunc=string.upper}
    local actual = ftcsv.parse("a,b,c\napple,banana,carrot", ",", options)
    
  • headers

    Set headers to false if the file you are reading doesn't have any headers. This will cause ftcsv to create indexed tables rather than key-value tables for the output.

    local options = {loadFromString=true, headers=false}
    local actual = ftcsv.parse("apple>banana>carrot\ndiamond>emerald>pearl", ">", options)
    

    Note: Headerless files can still use the rename option, and after a field has been renamed, it can be specified as a field to keep. The rename syntax changes a little bit:

    local options = {loadFromString=true, headers=false, rename={"a","b","c"}, fieldsToKeep={"a","b"}}
    local actual = ftcsv.parse("apple>banana>carrot\ndiamond>emerald>pearl", ">", options)
    

    In the above example, the first field becomes 'a', the second field becomes 'b' and so on.

For all tested examples, take a look in /spec/feature_spec.lua and /spec/dynamic_features_spec.lua

Encoding

ftcsv.encode(inputTable, delimiter[, options])

ftcsv can also take a lua table and turn it into a text string to be written to a file. It has two required parameters, an inputTable and a delimiter. You can use it to write out a file like this:

local fileOutput = ftcsv.encode(users, ",")
local file = assert(io.open("ALLUSERS.csv", "w"))
file:write(fileOutput)
file:close()

Options

  • fieldsToKeep

    If fieldsToKeep is set in the encode process, only the specified fields will be written out to a file.

    local output = ftcsv.encode(everyUser, ",", {fieldsToKeep={"Name", "Phone", "City"}})
    

Error Handling

ftcsv returns a bunch of errors when passed a bad csv file or incorrect parameters. You can find a more detailed explanation of the more cryptic errors in ERRORS.md
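
Assuming these errors are raised with Lua's error() rather than returned, you can wrap the call in pcall if you'd rather handle a bad file gracefully than stop the program. A minimal sketch (the file name is just an example):

local ftcsv = require('ftcsv')

local ok, result = pcall(ftcsv.parse, "possibly-malformed.csv", ",")
if ok then
    print(#result .. " rows parsed")
else
    -- on failure, result holds the error message
    print("could not parse: " .. tostring(result))
end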

Benchmarks

We ran ftcsv against a few different CSV parsers for Lua (PIL/csvutils, lua_csv, and lpeg_josh) and here is what we found:

20 MB file, every field is double quoted (ftcsv optimal lua case*)

Parser         Lua                   LuaJIT
PIL/csvutils   3.939 +/- 0.565 SD    1.429 +/- 0.175 SD
lua_csv        8.487 +/- 0.156 SD    3.095 +/- 0.206 SD
lpeg_josh      1.350 +/- 0.191 SD    0.826 +/- 0.176 SD
ftcsv          3.101 +/- 0.152 SD    0.499 +/- 0.133 SD

* see Performance section below for an explanation

12 MB file, some fields are double quoted

Parser         Lua                   LuaJIT
PIL/csvutils   2.868 +/- 0.101 SD    1.244 +/- 0.129 SD
lua_csv        7.773 +/- 0.083 SD    3.495 +/- 0.172 SD
lpeg_josh      1.146 +/- 0.191 SD    0.564 +/- 0.121 SD
ftcsv          3.401 +/- 0.109 SD    0.441 +/- 0.124 SD

LuaCSV was also tried, but usually errored out at odd places during parsing.

NOTE: times are measured using os.clock(), so they are in CPU seconds. Each test was run 30 times in a randomized order. The file was pre-loaded, and only the csv decoding time was measured.

Benchmarks were run under ftcsv 1.1.6
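
For reference, the measurement approach described above (pre-load the file, time only the decode with os.clock(), repeat 30 times) can be sketched roughly as follows; the file name is a placeholder and this is not the actual benchmark script:

local ftcsv = require('ftcsv')

-- read the whole file up front so only decoding is timed
local f = assert(io.open("benchmark.csv", "r"))
local contents = f:read("*all")
f:close()

local runs = 30
local times = {}
for i = 1, runs do
    local start = os.clock()
    ftcsv.parse(contents, ",", {loadFromString=true})
    times[i] = os.clock() - start
end

-- mean and sample standard deviation over the runs
local sum = 0
for _, t in ipairs(times) do sum = sum + t end
local mean = sum / runs
local sqDiff = 0
for _, t in ipairs(times) do sqDiff = sqDiff + (t - mean) ^ 2 end
local sd = math.sqrt(sqDiff / (runs - 1))
print(string.format("%.3f +/- %.3f SD", mean, sd))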

Performance

We did some basic testing and found that in Lua, if you want to iterate over a string character by character and look for single characters, string.byte performs faster than string.sub. This is especially true for LuaJIT. As such, under LuaJIT, ftcsv iterates over the whole file and does byte compares to find quotes and delimiters. For pure Lua, however, string.find is used to find quotes and string.byte is used everywhere else, since the CSV format in its proper form will have quotes around fields. If you have thoughts on how to improve performance (either big picture or specifically within the code), create a GitHub issue - I'd love to hear about it!
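
If you want to see the difference yourself, here is a minimal micro-benchmark sketch comparing the two approaches on a character-by-character scan for a delimiter (the input string and its size are arbitrary):

local s = string.rep("a,b,c\n", 100000)
local comma = string.byte(",")

-- scan with string.sub: extracts and compares one-character strings
local start = os.clock()
local subCount = 0
for i = 1, #s do
    if string.sub(s, i, i) == "," then subCount = subCount + 1 end
end
print(string.format("string.sub:  %.3fs (%d delimiters)", os.clock() - start, subCount))

-- scan with string.byte: compares plain numbers
start = os.clock()
local byteCount = 0
for i = 1, #s do
    if string.byte(s, i) == comma then byteCount = byteCount + 1 end
end
print(string.format("string.byte: %.3fs (%d delimiters)", os.clock() - start, byteCount))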

Contributing

Feel free to create a new issue for any bugs you've found or help you need. If you want to contribute back to the project please do the following:

  1. If it's a major change (aka more than a quick bugfix), please create an issue so we can discuss it!
  2. Fork the repo
  3. Create a new branch
  4. Push your changes to the branch
  5. Run the test suite and make sure it still works
  6. Submit a pull request
  7. Wait for review
  8. Enjoy the changes made!

Licenses

  • The main library is licensed under the MIT License. Feel free to use it!
  • Some of the test CSVs are from csv-spectrum (BSD-2-Clause) which includes some from csvkit (MIT License)