Fix Python – _csv.Error: field larger than field limit (131072)


Asked By – user1251007

I have a script reading in a csv file with very large fields:

# example from
import csv
with open('some.csv', newline='') as f:
    reader = csv.reader(f)
    for row in reader:
        pass  # the rows are analyzed in subsequent steps

However, this throws the following error on some csv files:

_csv.Error: field larger than field limit (131072)

How can I analyze csv files with huge fields? Skipping the lines with huge fields is not an option, as the data needs to be analyzed in subsequent steps.

Now we will see the solution for the issue: _csv.Error: field larger than field limit (131072)


The csv file might contain very large fields; therefore, increase the field_size_limit:

import sys
import csv

csv.field_size_limit(sys.maxsize)


sys.maxsize works for Python 2.x and 3.x. sys.maxint would only work with Python 2.x (SO: what-is-sys-maxint-in-python-3)
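As a quick end-to-end check (a minimal sketch; the in-memory CSV and the 10,000,000-byte limit are illustrative choices, not from the question), raising the limit lets a field beyond the default 131072 bytes parse cleanly:

```python
import csv
import io

# Raise the limit to 10 MB; this value fits in a 32-bit C long,
# so it also sidesteps the OverflowError that sys.maxsize can
# trigger on some platforms (e.g. 64-bit Windows).
csv.field_size_limit(10_000_000)

# An in-memory CSV whose second field is 200,000 characters long,
# well above the default 131072-byte limit.
big_field = "x" * 200_000
data = io.StringIO("id,payload\n1," + big_field + "\n")

rows = list(csv.reader(data))
print(len(rows[1][1]))  # 200000
```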


As Geoff pointed out, the code above might result in the following error: OverflowError: Python int too large to convert to C long.
To circumvent this, you could use the following quick and dirty code (which should work on every system with Python 2 and Python 3):

import sys
import csv

maxInt = sys.maxsize

while True:
    # decrease the maxInt value by factor 10
    # as long as the OverflowError occurs.
    try:
        csv.field_size_limit(maxInt)
        break
    except OverflowError:
        maxInt = int(maxInt / 10)
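An alternative to the trial-and-error loop (a sketch, not part of the original answer): since the error message suggests the limit is stored in a C long, you can compute the platform's maximum C long directly with the standard-library ctypes module and set that as the limit:

```python
import csv
import ctypes

# The largest value a signed C long can hold on this platform:
# 2^(bits-1) - 1, where bits is the size of a C long in bits.
c_long_max = 2 ** (ctypes.sizeof(ctypes.c_long) * 8 - 1) - 1

# Set the limit; calling field_size_limit() with no argument
# afterwards returns the limit now in effect.
csv.field_size_limit(c_long_max)
print(csv.field_size_limit() == c_long_max)  # True
```

This avoids catching OverflowError at all, because the value is by construction the largest one the underlying C type accepts.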

This question is answered By – user1251007

This answer is collected from stackoverflow and reviewed by FixPython community admins, is licensed under cc by-sa 2.5 , cc by-sa 3.0 and cc by-sa 4.0