My Docker-based AI/ML Training Environment
Dive into Docker, JupyterLab, and machine learning with 'learning-python'. It features Python 3.9, NodeJS, and PostgreSQL with pgvector. Start it with make start and open localhost:8888.
Without further ado, here is the link to the GitHub project I've created to stretch my muscles with Machine Learning, Artificial Intelligence, and Python:
Run it on your computer, or spin it up as a GitHub Codespace. It works both ways.
You get:
- A JupyterLab server with its web interface at localhost:8888, running Python 3.9 and NodeJS 20.x kernels
- A Postgres DB with pgvector already installed (see the connection sketch after these lists)
- A Docker Compose project that spins up containers
- A Makefile interface to operate the project
You need:
- Docker and Docker Compose
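As a quick sanity check once everything is running, here is a minimal sketch of what talking to the bundled Postgres + pgvector instance from a notebook might look like. The hostname, port, credentials, and the availability of psycopg2 are assumptions on my part; check docker-compose.yml for the real values, and add the driver as a dependency if it is missing (see "Managing Dependencies" below).

```python
# Minimal sketch: connect to the bundled Postgres + pgvector instance
# from a notebook. Host, port, and credentials are assumptions; check
# docker-compose.yml for the actual values used by this project.
import psycopg2

conn = psycopg2.connect(
    host="localhost",   # or the Compose service name, e.g. "db", from inside the container
    port=5432,
    dbname="postgres",  # hypothetical database name
    user="postgres",    # hypothetical credentials
    password="postgres",
)

with conn, conn.cursor() as cur:
    # Enable the pgvector extension and store a tiny embedding
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute("CREATE TABLE IF NOT EXISTS items (id serial PRIMARY KEY, embedding vector(3));")
    cur.execute("INSERT INTO items (embedding) VALUES ('[1, 2, 3]');")
    # Nearest-neighbour query using the L2 distance operator provided by pgvector
    cur.execute("SELECT id FROM items ORDER BY embedding <-> '[0, 0, 0]' LIMIT 1;")
    print(cur.fetchone())
```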
Here are a few high-level commands to operate the environment:
# Start the services
make start
# Stop the services
make stop
👉 Once the environment starts, open your browser to http://localhost:8888
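If you want to confirm that the kernels are the advertised ones, a throwaway cell like this (a generic sketch, nothing project-specific) prints the Python interpreter version and the Node runtime available inside the container:

```python
# Run this in a new notebook cell to confirm the environment:
# it prints the Python interpreter version and the Node.js version
# available inside the container.
import subprocess
import sys

print("Python:", sys.version.split()[0])  # expected: 3.9.x
print("Node:", subprocess.run(
    ["node", "--version"], capture_output=True, text=True
).stdout.strip())                          # expected: v20.x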
Managing Dependencies
Since everything runs inside Docker containers, you will need to modify the corresponding Dockerfile in order to install dependencies for Python or NodeJS.
- Open .jupiter/Dockerfile with your favourite editor.
- Search for "Python Dependencies" or "Node Dependencies"
- Add what you need
- Run the following command:
# Install the dependencies and restart the environment
make reset
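As a sanity check after the rebuild, a cell like the following confirms that the Python packages you added are actually available in the rebuilt image. The package names numpy and pandas are only placeholders here; substitute whatever you put in the Dockerfile.

```python
# After `make reset`, confirm the packages you added to the Dockerfile
# are importable. numpy/pandas are placeholders; use your own package names.
import importlib

for pkg in ("numpy", "pandas"):
    try:
        module = importlib.import_module(pkg)
        print(f"{pkg} {getattr(module, '__version__', '?')} is installed")
    except ImportError:
        print(f"{pkg} is NOT installed - check the Dockerfile and re-run `make reset`")
```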