Senior App Dev @ Acuity Brands Lighting | Co-Founder of https://ct3dao.io | President of https://NewHaven.IO | Maintainer of https://TechEnthusiastScholarship.com | https://HenryGives.Coffee
Location
New Haven, CT
Education
Computer Network & Information Security @ Champlain College
What exactly are you having problems with, @Bassem?
I read about it online for a while first. Everyone was saying it's better to have the database as a standalone image, so I got an Ubuntu MySQL Dockerfile off the internet to create that DB image.
Now I have two images: one for the Node.js server and another for the MySQL DB.
But I can't connect the two. I even tried accessing the database with phpMyAdmin from outside the Docker container, and that failed too.
In my docker-compose, I created a network, set both images on the same network, and added links: - db:db to make the backend server depend on the database.
I know the network itself is fine, because I have a third frontend server image and it can communicate with the backend successfully.
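For context, the setup described here would look roughly like this in a docker-compose.yaml (a sketch: the service and network names are assumptions, not the actual file):

```yaml
# Sketch of the described setup -- service/network names are assumptions
version: "2"
services:
  backend:
    build: ./backend
    networks:
      - appnet
    links:
      - db:db        # legacy alias; on a shared network the service name already resolves
  db:
    build: ./db
    networks:
      - appnet
networks:
  appnet:
```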
For the phpMyAdmin bit, are you publishing a port on the DB container? If you put two containers on a Docker network like that, they can talk to everything on that network, but you have to explicitly publish a port for that container to be reachable from anything "local" that isn't on the network.
For the Node -> DB bit, do you mind sharing your docker-compose.yaml and the connection string you're using in your Node container?
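For reference, here's the expose-vs-publish distinction in compose form (a sketch; the mysql image tag is an assumption):

```yaml
services:
  db:
    image: mysql:5.7
    expose:
      - "3306"        # reachable only from other containers on the same network
    ports:
      - "3306:3306"   # ALSO published on the host, so host-side tools like phpMyAdmin can reach it
```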
Yeah sure, no problem!
I think I am exposing ports with the "3306:3306" mapping, and for the backend config I have a .env file.
Actually, I have no idea what the username/password for the DB should be, but according to the logs that's not the issue anyway.
You're correct that you're exposing port 3306.
Your ENV file looks good, but what about the line of code that's using those ENV variables?
I am using a package called Sequelize to connect to the DB, and here is how:
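The actual connection code isn't shown here, but a minimal sketch of how Sequelize would consume those .env values might look like this. The variable names and the fallback host "db" are assumptions, not the original code; the key point is that inside a compose network the host must be the DB container's name, never 127.0.0.1 (which resolves to the backend container itself):

```javascript
// Sketch only -- assumes the `sequelize`/`mysql2` packages and these .env names.
function connectionOptions(env) {
  return {
    host: env.DB_HOST || 'db',          // the DB container's name, not localhost
    port: Number(env.DB_PORT || 3306),
    dialect: 'mysql',
  };
}

// With Sequelize this would be passed as:
// const sequelize = new Sequelize(env.DB_NAME, env.DB_USER, env.DB_PASS, connectionOptions(env));
console.log(connectionOptions(process.env));
```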
Running
docker-compose up
produces this error:
Unhandled rejection SequelizeConnectionRefusedError: connect ECONNREFUSED 127.0.0.1:3306
Could it be that the DB is not yet up and running?
I mean, could be. Have you tried using Kitematic, or
docker exec -it <container name> /bin/bash
?
Also, what is your output for
docker ps -a
when you have your compose file spun up?
Edit: I see a couple of things that could be at fault here, but I want to see the answers to those two questions before I take a stab at it. ^
I'd consider updating your docker-compose.yaml to version 3.x and using the container_name key to force a name on your container.
Then, in your .env file, set DB_HOST to whatever you set that container_name to. I understand why you're trying to use localhost, but if all of these containers are on one Docker network, they'll want to refer to each other by their container names.
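A sketch of that suggestion (the name "db" and the .env key are assumptions):

```yaml
# docker-compose.yaml
version: "3.8"
services:
  db:
    image: mysql:5.7
    container_name: db

# and in the backend's .env file:
# DB_HOST=db   <- the container_name, not localhost
```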
I feel like I need a better understanding of Docker.
I'll definitely try doing that and will update you.
We all start somewhere, my dude. :)