I'm building a multi-container App. Here is the overall view of the working directory:
MABSED/
|_ docker-compose.yml
|_ ...
|_ streamer/
| |_ Dockerfile
| |_ startStreaming.py
| |_ credentials.py
|_ orchestrator/
|_ Dockerfile
|_ requirements.txt
|_ tasks.py
|_ my_sched.py
|_ data/
| |_ streaming/
| |_ preprocessed/
| |_ results/
|_ detector/
|_ filter/
|_ lemmatizer/
My App has 4 different services: an ElasticSearch container, a dashboard, a Streamer which captures tweets from Twitter, and an Orchestrator which performs a task and saves the results in ElasticSearch.
This question involves just two of the services: the Streamer and the Orchestrator. As I said, I want these two components to share data, which for my App means that the Orchestrator should be able to access the tweets captured by the Streamer. Moreover, I want this data to be stored in the local directory MABSED/orchestrator/data/ on my computer, not only inside the container, in case I need to access that information after I have stopped the process.
In other words, when I run docker-compose up, these two containers should both see the data stored in MABSED/orchestrator/data/ and add their corresponding files, so that when the Streamer adds a new file to MABSED/orchestrator/data/streaming/, the Orchestrator can notice the change and add a new file to MABSED/orchestrator/data/results/.
Also, startStreaming.py, the script the Streamer service runs, saves the data to the relative path output_directory = '../orchestrator/data/streaming'. This works fine locally, but I don't know whether it will inside the Docker container.
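For reference, the kind of change detection I have in mind on the Orchestrator side is a simple polling loop. This is only a sketch; the directory path is an assumption matching the bind-mount target inside the container, not code from my project:

```python
import os
import time

def new_files(directory, seen):
    """Return the set of files in `directory` not yet in `seen`,
    plus the updated `seen` set."""
    current = set(os.listdir(directory))
    return current - seen, current

# Assumed path: the bind-mount target inside the Orchestrator container.
STREAMING_DIR = '/usr/src/app/orchestrator/data/streaming'

def watch(streaming_dir=STREAMING_DIR, poll_seconds=5):
    """Poll the shared directory and react to files added by the Streamer."""
    seen = set()
    while True:
        added, seen = new_files(streaming_dir, seen)
        for name in sorted(added):
            # Here the real code would process the tweets and
            # write a corresponding file under .../data/results/.
            print(f'new tweet file: {name}')
        time.sleep(poll_seconds)
```

A library like watchdog would avoid polling, but a loop like this has no extra dependencies.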
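One way to make the output path work both locally and in the container is to read it from an environment variable with a container-side default. A sketch, where DATA_DIR is a made-up variable name and the fallback is an assumed in-container mount target:

```python
import os

# DATA_DIR is a hypothetical environment variable that each service could
# set in docker-compose.yml; the fallback is the path inside the container.
DATA_DIR = os.environ.get('DATA_DIR', '/usr/src/app/orchestrator/data')
output_directory = os.path.join(DATA_DIR, 'streaming')
```

Locally this could be run as DATA_DIR=../orchestrator/data python startStreaming.py, while in docker-compose.yml each service would set DATA_DIR under its environment: key.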
At the moment, my docker-compose.yml looks like this:
version: '2'
services:
  dashboard:
    build: demo-dashboard/
    ports:
      - "8080:8080"
    environment:
      - ES_ENDPOINT_EXTERNAL=http://localhost:9200
      - http.cors.enabled=true
      - http.cors.allow-origin=ES_ENDPOINT_EXTERNAL
      - http.cors.allow-headers=Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With
      - http.cors.allow-credentials=true
    volumes:
      - ./demo-dashboard:/usr/src/app
    networks:
      - dashboard-network
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.7.0
    environment:
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - http.cors.enabled=true
      - http.cors.allow-origin=http://localhost:8080
      - http.cors.allow-headers=Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With
      - http.cors.allow-credentials=true
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
    mem_limit: 1g
    cap_add:
      - IPC_LOCK
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    networks:
      - dashboard-network
    ports:
      - 9200:9200
  orchestrator:
    image: orchestrator-mabsed
    build: orchestrator/
    environment:
      ES_HOST: 'elasticsearch'
    tty: true
    volumes:
      - ./orchestrator/data/:/usr/src/app/orchestrator/data
  streamer:
    image: streamer-mabsed
    build: streamer/
    tty: true
    volumes:
      - ./orchestrator/data/:/usr/src/orchestrator/data
volumes:
  esdata1:
    driver: local
networks:
  dashboard-network:
    driver: bridge
I think I need to create a volume to achieve this, but I'm relatively new to Docker and don't know how to manage it.
Here is my Streamer Dockerfile:
FROM python:3.6
RUN pip3 install --user tweepy
WORKDIR /usr/src/app/
COPY startStreaming.py /usr/src/app/
COPY credentials.py /usr/src/app/
CMD python startStreaming.py
and my Orchestrator Dockerfile:
FROM python:3.6
COPY . /usr/src/app/
WORKDIR /usr/src/app/
RUN pip3 install --user -r requirements.txt
CMD python my_sched.py
You can share the same local directory between your services.
Just make sure your code refers to the directory via its shared path inside the containers — in this case, /usr/src/app/orchestrator/data.
Note that a host path in a Compose volume mapping must be absolute or start with ./ (relative to the docker-compose.yml file); a bare name like MABSED/... would be interpreted as a named volume instead.
Sample:
orchestrator:
  image: orchestrator-mabsed
  build: orchestrator/
  environment:
    ES_HOST: 'elasticsearch'
  tty: true
  volumes:
    - ./orchestrator/data/:/usr/src/app/orchestrator/data
streamer:
  image: streamer-mabsed
  build: streamer/
  tty: true
  volumes:
    - ./orchestrator/data/:/usr/src/app/orchestrator/data
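If you prefer not to depend on host paths, a named volume shared by both services also works. This is only a sketch — the volume name tweet-data is made up — and note that the data would then live under Docker's internal storage area rather than in MABSED/orchestrator/data/, so the bind mount above better fits your stated goal of accessing the files on the host:

```yaml
services:
  orchestrator:
    volumes:
      - tweet-data:/usr/src/app/orchestrator/data
  streamer:
    volumes:
      - tweet-data:/usr/src/app/orchestrator/data

volumes:
  tweet-data:
    driver: local
```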