I have the following folder structure in a Python project:
my_project/
├── Dockerfile
├── Makefile
├── run.py
├── data/
│   ├── raw/
│   └── processed/
└── src/
    ├── __init__.py
    ├── config.py
    ├── settings.env
    └── response/
        ├── __init__.py
        ├── llm.py
        ├── instances.py
        └── get_response.py
Context
I started with a simple project to get structured responses from an LLM using Python. As the project grew, I decided to make it more production-ready by adding proper structure and organization. However, I'm unsure if my current structure is optimal, particularly regarding import handling.
Current Issues
Initially, when all code was in one folder, imports were straightforward:
# When everything was in one folder
from llm import get_completion
from instances import MyClass
After restructuring and using run.py as the main entry point, I had to modify imports to work from the parent directory:
# In run.py
from src.response.llm import get_completion
from src.response.instances import MyClass
Specific Questions
Is this the correct way to structure a production-ready Python project? How should I handle imports when I want to:
Run tests from a separate test directory?
Execute files directly within their folders (e.g., for development/debugging)?
Use the if __name__ == '__main__': block with test code in individual files?
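To illustrate the third point, this is the kind of ad-hoc test block I mean (a sketch; `get_completion`'s body here is a placeholder, the real function calls an LLM):

```python
# src/response/llm.py — sketch; the function body is a hypothetical stand-in
def get_completion(prompt: str) -> str:
    # The real implementation would call the LLM API here
    return f"response to: {prompt}"

if __name__ == '__main__':
    # Quick smoke test for development/debugging.
    # If this module imported a sibling via `from src.response.instances import ...`,
    # running `python llm.py` from inside src/response/ would raise
    # ModuleNotFoundError, because the project root is not on sys.path.
    print(get_completion("hello"))
```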
Do I need to modify import statements every time I run files from a different location? Is adding the project root to the Python path when running files directly the best option?
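By "adding the project root to the Python path" I mean a guard like this at the top of a script (a sketch; the path arithmetic assumes the file lives in src/response/):

```python
import sys
from pathlib import Path

# Walk up from src/response/<script>.py to my_project/ (three .parent hops),
# then prepend it so `import src...` resolves regardless of the current
# working directory.
PROJECT_ROOT = Path(__file__).resolve().parent.parent.parent
if str(PROJECT_ROOT) not in sys.path:
    sys.path.insert(0, str(PROJECT_ROOT))

# from src.response.llm import get_completion  # would now work from anywhere
```

This works, but it has to be repeated in every runnable module, which is why I suspect it is not the right approach.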
Technical Details
Python version: 3.11
Current behavior: files only run correctly when executed from the project root
Desired behavior: Ability to run files, tests, and debug code from any location without constantly modifying imports (if possible)
Currently I cannot run my scripts from their own directories; I have to run them from the project root. How can I make them work from any location?