
Stack All Flow

Posted on • Originally published at stackallflow.com

How to Run a Single Command at Startup Using Systemd in Ubuntu?

#startup #systemd

I’d like to startup an Apache Spark cluster after boot using the following command:

sudo ./path/to/spark/sbin/start-all.sh


Then run this command when the system prepares to reboot/shutdown:

sudo ./path/to/spark/sbin/stop-all.sh


How can I get started? Is there a basic template I can build on?

I’ve tried using an extremely simple service file (/lib/systemd/system/spark.service):

[Unit]
Description=Spark service

[Service]
ExecStart=sudo ./path/to/spark/sbin/start-all.sh


Which doesn’t work.

Accepted Answer

Your .service file should look like this:

[Unit]
Description=Spark service

[Service]
ExecStart=/path/to/spark/sbin/start-all.sh

[Install]
WantedBy=multi-user.target


Now, take a few more steps to enable and use the .service file:

  1. Place it in the /etc/systemd/system folder with, say, the name myfirst.service.
  2. Make sure that your script is executable:
chmod u+x /path/to/spark/sbin/start-all.sh

  3. Start it:
sudo systemctl start myfirst

  4. Enable it to run at boot:
sudo systemctl enable myfirst

  5. Stop it:
sudo systemctl stop myfirst
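Step 1 can also be scripted. A minimal sketch that writes the unit file from the answer with a heredoc — the temp destination is an assumption for illustration only; a real install targets /etc/systemd/system/myfirst.service as root, followed by `sudo systemctl daemon-reload`:

```shell
# Sketch: write the unit file shown above. UNIT_DIR defaults to /tmp
# here so the snippet runs anywhere; a real install writes to
# /etc/systemd/system/myfirst.service (as root).
UNIT_DIR="${UNIT_DIR:-/tmp}"
cat > "$UNIT_DIR/myfirst.service" <<'EOF'
[Unit]
Description=Spark service

[Service]
ExecStart=/path/to/spark/sbin/start-all.sh

[Install]
WantedBy=multi-user.target
EOF

# After adding or editing a unit file, make systemd re-read its config:
# sudo systemctl daemon-reload
```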

Notes

  1. You don’t need to launch Spark with sudo in your service, as the default service user is already root.
  2. Look at the links below for more systemd options.

Moreover

What we have above is rudimentary; here is a more complete setup for Spark:

[Unit]
Description=Apache Spark Master and Slave Servers
After=network.target
After=systemd-user-sessions.service
After=network-online.target

[Service]
User=spark
Type=forking
ExecStart=/opt/spark-1.6.1-bin-hadoop2.6/sbin/start-all.sh
ExecStop=/opt/spark-1.6.1-bin-hadoop2.6/sbin/stop-all.sh
TimeoutSec=30
Restart=on-failure
RestartSec=30
StartLimitInterval=350
StartLimitBurst=10

[Install]
WantedBy=multi-user.target

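Type=forking is used here because start-all.sh launches the Spark daemons in the background and then exits, so systemd must track the forked children rather than the script itself. A hypothetical stand-in script (not part of Spark) illustrates the pattern:

```shell
# Hypothetical stand-in for start-all.sh, showing the behaviour
# Type=forking expects: spawn the real work in the background,
# record its PID, and let the parent exit.
cat > /tmp/fake-start-all.sh <<'EOF'
#!/bin/sh
sleep 5 &                      # stands in for the Spark daemons
echo $! > /tmp/fake-spark.pid  # a PID file systemd could watch via PIDFile=
exit 0                         # parent exits; the child keeps running
EOF
chmod u+x /tmp/fake-start-all.sh
/tmp/fake-start-all.sh
```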

To set up the service:

sudo systemctl start spark.service
sudo systemctl stop spark.service
sudo systemctl enable spark.service

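For context, `systemctl enable` works by symlinking the unit into the target's `.wants` directory. A sketch of the equivalent operation done by hand — in a temp tree standing in for /etc/systemd so it can run anywhere:

```shell
# What `systemctl enable spark.service` does under the hood: link the
# unit into multi-user.target.wants so it starts at boot. A temp tree
# stands in for /etc/systemd here.
ROOT=/tmp/etc-systemd-demo
mkdir -p "$ROOT/system/multi-user.target.wants"
printf '[Unit]\nDescription=Apache Spark\n' > "$ROOT/system/spark.service"
ln -sf "$ROOT/system/spark.service" \
       "$ROOT/system/multi-user.target.wants/spark.service"
```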

Further reading

Please read through the following links. Spark is a complex setup, so you should understand how it integrates with Ubuntu’s init system, systemd.

