When we use f.readlines() (or repeated f.readline() calls) to read file data, we can run into a problem: the file may be so large that it does not fit into our computer's memory.
So how do we handle a large file in Python? Here is one solution, a generator that reads the file in fixed-size chunks: 👇
#!/usr/bin/env python
# -*- coding: utf-8 -*-
def file_reader(f, newline):
    """
    Parameters:
    - f: a file object opened in text mode
    - newline: the record separator
    Description:
    A generator for processing a huge file: it reads
    fixed-size chunks and yields one record at a time,
    so the whole file never has to be in memory at once.
    """
    buf = ""
    while True:
        # Emit every complete record currently in the buffer.
        while newline in buf:
            pos = buf.index(newline)
            yield buf[:pos]
            buf = buf[pos + len(newline):]
        # Refill the buffer; an empty read means end of file.
        chunk = f.read(4096)
        if not chunk:
            yield buf
            break
        buf += chunk
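A quick self-contained check of the generator (the sample string and the io.StringIO in-memory file are illustrative; the function is repeated here so the snippet runs on its own):

```python
import io

def file_reader(f, newline):
    """Yield records from f, split on the given separator."""
    buf = ""
    while True:
        while newline in buf:
            pos = buf.index(newline)
            yield buf[:pos]
            buf = buf[pos + len(newline):]
        chunk = f.read(4096)
        if not chunk:
            yield buf
            break
        buf += chunk

# io.StringIO stands in for a real file opened in text mode.
data = io.StringIO("alpha||beta||gamma")
print(list(file_reader(data, "||")))  # ['alpha', 'beta', 'gamma']
```

Note that records are produced lazily, so even if the file holds millions of records, only one chunk and one record live in memory at a time.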