Use mmap module for memory-mapped file I/O. Treats file as memory buffer. Good for large files, shared memory. Consider platform differences, file size limitations. Handle proper cleanup.
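A minimal sketch of read-only memory mapping (length 0 maps the whole file; note mmap cannot map an empty file):

```python
import mmap
import os
import tempfile

# Write a small sample file, then map it into memory.
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(b"hello mmap world")

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        chunk = mm[6:10].decode()   # slice like bytes, without reading the whole file
        pos = mm.find(b"world")     # search happens in the mapped region
```

The `with` blocks close both the map and the file, which covers the cleanup concern above.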
Use os.path or pathlib for path operations. Handle permissions, existence checks. Consider race conditions, atomic operations. Implement proper error handling. Use shutil for high-level operations.
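A quick sketch combining os.path checks with a shutil high-level copy (paths here are temporary, for illustration):

```python
import os
import shutil
import tempfile

# Create a source file, copy it with metadata, then verify the result.
src_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "a.txt")
with open(src, "w") as f:
    f.write("payload")

dst = os.path.join(src_dir, "b.txt")
shutil.copy2(src, dst)            # copy2 preserves metadata such as mtime
exists = os.path.exists(dst)
size = os.path.getsize(dst)
```

Note that an exists-check followed by an open is still a race window; prefer opening directly and handling the exception when that matters.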
Use pathlib.Path for object-oriented path operations. Handle platform differences, relative paths. Consider path normalization, validation. Handle special characters, spaces.
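A minimal pathlib sketch; these are pure path operations, no file I/O happens:

```python
from pathlib import Path

# '/' joins components portably across platforms.
p = Path("data") / "logs" / "app.log"
name = p.name                       # 'app.log'
suffix = p.suffix                   # '.log'
stem = p.stem                       # 'app'
renamed = p.with_suffix(".txt")     # path arithmetic only; nothing is renamed on disk
parts = p.parts                     # tuple of components
```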
read() reads entire file into string, readline() reads single line, readlines() reads all lines into list. read() can specify size in bytes. Consider memory usage for large files. Use iteration for efficient line reading.
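The three methods side by side, with io.StringIO standing in for a real file (the methods behave identically):

```python
import io

f = io.StringIO("one\ntwo\nthree\n")
first = f.readline()        # 'one\n' — a single line, newline kept
rest = f.readlines()        # ['two\n', 'three\n'] — remaining lines as a list

f.seek(0)
# Iterating the file object reads lazily, one line at a time — the
# memory-efficient choice for large files.
lines = [line.strip() for line in f]
```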
Use watchdog library or platform-specific APIs. Handle file system events. Consider polling vs event-based. Implement proper cleanup. Handle recursive monitoring.
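A stdlib-only polling sketch for contrast; event-based libraries like watchdog wrap OS APIs (inotify, FSEvents, ReadDirectoryChangesW) and avoid this loop:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "watched.txt")
with open(path, "w") as f:
    f.write("v1")

def snapshot(p):
    # mtime plus size; size catches changes even on coarse-timestamp filesystems
    st = os.stat(p)
    return (st.st_mtime_ns, st.st_size)

before = snapshot(path)
with open(path, "a") as f:      # simulate an external modification
    f.write(" more")
changed = snapshot(path) != before
```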
Use mimetypes module, file signatures. Handle binary vs text files. Consider encoding detection. Implement proper validation. Handle unknown file types.
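mimetypes guesses from the filename only; content-based detection means checking magic bytes yourself. A sketch of both (the PNG helper is an illustrative function, not a stdlib API):

```python
import mimetypes

# Name-based guess: (mime_type, encoding); unknown extensions give (None, None).
mt, _ = mimetypes.guess_type("report.pdf")
unknown, _ = mimetypes.guess_type("file.unknownext")

def looks_like_png(data: bytes) -> bool:
    # Content-based check against the 8-byte PNG file signature.
    return data.startswith(b"\x89PNG\r\n\x1a\n")
```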
Common modes: 'r' (read), 'w' (write), 'a' (append), 'b' (binary), '+' (read/write). Can combine modes: 'rb' (read binary), 'w+' (read/write). Default is text mode, 'b' for binary. Consider encoding parameter for text files.
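The common modes in sequence — write, append, read back as binary (newline='' disables newline translation so the bytes are predictable):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "modes.txt")
with open(path, "w", encoding="utf-8", newline="") as f:   # 'w' truncates/creates
    f.write("first\n")
with open(path, "a", encoding="utf-8", newline="") as f:   # 'a' appends at the end
    f.write("second\n")
with open(path, "rb") as f:                                # 'rb' yields bytes, not str
    raw = f.read()
```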
Use csv module: reader, writer, DictReader, DictWriter classes. Handle delimiters, quoting, escaping. Consider newline='' parameter when opening. Example: csv.DictReader for named columns. Handle different dialects.
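A DictWriter/DictReader round-trip through an in-memory buffer; with real files, pass newline='' to open() as noted above:

```python
import csv
import io

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "qty"])
writer.writeheader()
writer.writerow({"name": "bolt", "qty": 4})
writer.writerow({"name": "nut, hex", "qty": 9})   # embedded comma forces quoting

buf.seek(0)
rows = list(csv.DictReader(buf))   # values come back as strings
```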
Use 'b' mode flag, read/write bytes objects. Methods: read(size), write(bytes). Consider bytearray for mutable binary data. Handle endianness with the struct module for fixed binary layouts. Use memoryview and the buffer protocol for zero-copy slicing.
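A struct round-trip sketch: '<' selects little-endian with no padding, 'I' is a 4-byte unsigned int, 'd' an 8-byte double:

```python
import struct

packed = struct.pack("<Id", 42, 3.5)        # 12 bytes total
value, ratio = struct.unpack("<Id", packed)

# bytearray is the mutable counterpart of bytes; patch the low byte in place.
buf = bytearray(packed)
buf[0] = 43
patched = struct.unpack_from("<I", buf)[0]
```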
Specify encoding parameter in open(): UTF-8, ASCII, etc. Handle encoding/decoding errors with errors parameter. Use codecs module for advanced encoding. Consider BOM, default system encoding.
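The errors parameter in action on an encode/decode round trip (the same values work for open()'s errors parameter):

```python
text = "café"
utf8 = text.encode("utf-8")                      # b'caf\xc3\xa9'
back = utf8.decode("utf-8")                      # lossless round trip
lossy = utf8.decode("ascii", errors="replace")   # each bad byte becomes U+FFFD
ignored = utf8.decode("ascii", errors="ignore")  # bad bytes silently dropped
```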
Use fcntl module (Unix) or msvcrt (Windows). Implement advisory locking. Handle shared vs exclusive locks. Consider timeouts, deadlock prevention. Example: with FileLock(filename): using the third-party filelock package for a cross-platform API.
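A Unix-only sketch with fcntl.flock; a second process attempting LOCK_EX | LOCK_NB on the same file would get BlockingIOError:

```python
import fcntl
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "app.lock")
with open(path, "w") as f:
    fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)   # exclusive, non-blocking
    f.write("pid %d\n" % os.getpid())
    fcntl.flock(f, fcntl.LOCK_UN)                   # release explicitly
locked_ok = True
```

The lock is advisory: it only constrains processes that also call flock on the same file.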
Use tempfile module: NamedTemporaryFile, TemporaryDirectory. Auto-cleanup when closed. Secure creation, unique names. Consider cleanup on program exit. Handle permissions properly.
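TemporaryDirectory cleanup in action — the directory and everything inside it disappear when the with block exits:

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as tmpdir:
    path = os.path.join(tmpdir, "scratch.txt")
    with open(path, "w") as f:
        f.write("scratch")
    existed_inside = os.path.exists(path)   # True while the context is open

existed_after = os.path.exists(tmpdir)      # directory and contents are gone
```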
Use generators for line iteration, chunk reading. Consider memory usage, buffering. Use mmap for random access. Implement progress tracking. Handle cleanup properly. Consider parallel processing.
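A chunked-reading generator sketch; BytesIO stands in for a large file, and the tiny chunk_size is only to make the output visible:

```python
import io

def read_in_chunks(fobj, chunk_size=4):
    """Yield fixed-size chunks lazily instead of loading the whole file."""
    while True:
        chunk = fobj.read(chunk_size)
        if not chunk:       # empty read means end of file
            return
        yield chunk

chunks = list(read_in_chunks(io.BytesIO(b"abcdefghij")))
```

For real files a chunk_size of 64 KiB or more is typical.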
Use openpyxl or pandas for XLSX files; xlrd for reading and xlwt for writing the older .xls format. Handle sheets, formatting, formulas. Consider memory usage for large files (openpyxl offers read-only/write-only modes). Implement proper cleanup.
Use os.replace for atomic writes. Implement write-to-temp-then-rename pattern. Handle concurrent access. Consider file system limitations. Implement proper error recovery.
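A sketch of the write-to-temp-then-rename pattern (atomic_write is an illustrative helper, not a stdlib function); os.replace is atomic when source and destination are on the same filesystem:

```python
import os
import tempfile

def atomic_write(path, data: str):
    d = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=d)       # temp file on the same filesystem
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())            # make the data durable before the swap
        os.replace(tmp, path)               # atomic: readers see old or new, never partial
    except BaseException:
        os.unlink(tmp)                      # don't leave the temp file behind
        raise

target = os.path.join(tempfile.mkdtemp(), "config.json")
atomic_write(target, '{"ok": true}')
```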
Use 'with' statement: 'with open(filename) as f:'. Ensures proper file closure even if exceptions occur. Best practice over manual open/close. Can handle multiple files: 'with open(f1) as a, open(f2) as b:'.
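Demonstrating the guarantee with StringIO (any file-like context manager behaves the same): the object is closed even when the body raises:

```python
import io

f = io.StringIO("data")
with f:
    content = f.read()
closed = f.closed                   # True: closed on normal exit

g = io.StringIO("x")
try:
    with g:
        raise ValueError("boom")    # exception propagates out of the block...
except ValueError:
    pass
closed_after_error = g.closed       # ...but the file was still closed
```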
Buffering options: 0 (unbuffered, binary mode only), 1 (line buffered, text mode only), >1 (buffer size in bytes). Default is a block size chosen by the system. Set with the buffering parameter. Consider performance implications, memory usage. Flush the buffer manually with flush().
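A short sketch of the two extremes — unbuffered binary writes versus default buffering with a manual flush:

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "buf.bin")
with open(path, "wb", buffering=0) as f:   # every write goes straight to the OS
    f.write(b"unbuffered")

with open(path, "ab") as f:                # default block buffering
    f.write(b" tail")
    f.flush()                              # push the buffer out without closing

size = os.path.getsize(path)
```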
Objects implementing file interface (read, write, etc.). Examples: StringIO, BytesIO for in-memory files. Used for compatibility with file operations. Consider context manager implementation.
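Because consumers only call the file interface, in-memory objects slot in anywhere a file is expected — here json.dump writes into a StringIO:

```python
import io
import json

buf = io.StringIO()
json.dump({"k": 1}, buf)        # json only needs .write(), so StringIO works
written = buf.getvalue()

raw = io.BytesIO()              # BytesIO is the binary counterpart
raw.write(b"\x89PNG")
head = raw.getvalue()[:4]
```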
Use configparser for INI files, yaml for YAML. Handle defaults, validation. Consider environment overrides. Implement config reloading. Handle sensitive data properly.
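A configparser sketch showing the [DEFAULT] section acting as a fallback for every other section:

```python
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("""
[DEFAULT]
timeout = 30

[server]
host = example.org
""")

host = cfg["server"]["host"]
timeout = cfg["server"].getint("timeout")   # not in [server], falls back to DEFAULT
```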
Use gzip, zipfile, tarfile modules. Handle compression levels; note zipfile can read password-protected archives (setpassword) but cannot create encrypted ones. Consider streaming for large files. Implement progress tracking. Handle multi-file archives.
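In-memory round trips for both gzip (single stream) and zipfile (multi-file archive); for large data on disk, gzip.open and ZipFile over real files stream instead:

```python
import gzip
import io
import zipfile

# gzip: compress and restore a single byte stream.
payload = b"compress me " * 100
compressed = gzip.compress(payload)
restored = gzip.decompress(compressed)
smaller = len(compressed) < len(payload)   # repetitive data compresses well

# zipfile: a multi-file archive in a BytesIO buffer.
archive = io.BytesIO()
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("notes.txt", "hello")
with zipfile.ZipFile(archive) as z:
    names = z.namelist()
    text = z.read("notes.txt").decode()
```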
Use glob, fnmatch for patterns. re module for regex. Consider recursive search, filters. Handle large directories efficiently. Implement proper error handling.
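fnmatch filters name lists by shell pattern while glob walks the filesystem; a sketch of both over the same names:

```python
import fnmatch
import glob
import os
import tempfile

names = ["app.py", "app.pyc", "readme.md", "test_app.py"]
py_files = fnmatch.filter(names, "*.py")   # pure name matching, no filesystem access

# glob matches against real directory entries.
root = tempfile.mkdtemp()
for n in names:
    open(os.path.join(root, n), "w").close()
found = sorted(os.path.basename(p)
               for p in glob.glob(os.path.join(root, "*.py")))
```

For recursive search, glob supports the `**` pattern with recursive=True.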