So... it would seem that the code slurps the whole file into memory in one go and converts it to one big string with `Buffer.toString()`, which throws once the result exceeds V8's maximum string length (0x1fffffe8, roughly 512 million characters). That's not a good idea....

```
Error: Cannot create a string longer than 0x1fffffe8 characters
at Buffer.toString (node:buffer:784:17)
at convert (C:\Users\chris\GITW\nsldist\client\node_modules\csv2geojson\csv2geojson:23:34)
at ConcatStream.<anonymous> (C:\Users\chris\GITW\nsldist\client\node_modules\concat-stream\index.js:36:43)
at ConcatStream.emit (node:events:402:35)
at finishMaybe (C:\Users\chris\GITW\nsldist\client\node_modules\readable-stream\lib\_stream_writable.js:475:14)
at endWritable (C:\Users\chris\GITW\nsldist\client\node_modules\readable-stream\lib\_stream_writable.js:485:3)
at ConcatStream.Writable.end (C:\Users\chris\GITW\nsldist\client\node_modules\readable-stream\lib\_stream_writable.js:455:41)
at ReadStream.onend (node:internal/streams/readable:693:10)
at Object.onceWrapper (node:events:509:28)
at ReadStream.emit (node:events:390:28) {
code: 'ERR_STRING_TOO_LONG'
}
```