Find & View Bash Script Logs on IBM Cloud Code Engine
Running a Bash script on IBM Cloud Code Engine but can't find its output? This guide shows you how to easily view logs from stdout and stderr using the CLI and UI.
Alex Miller
Cloud Solutions Architect specializing in serverless computing and DevOps automation on IBM Cloud.
You’ve embraced the power and simplicity of IBM Cloud Code Engine. You have a trusty Bash script, maybe for data processing, nightly backups, or a quick utility task. You containerize it, deploy it as a Code Engine Job, and hit “run.” It completes in seconds. Success! Or... was it? Where did all your carefully placed `echo` statements go? How do you know what actually happened inside that container?
If you're asking, “Where did my echo go?”, you're in the right place. In a serverless environment like Code Engine, you don't just SSH into a machine to check `/var/log`. Instead, Code Engine captures everything your script prints to standard output (`stdout`) and standard error (`stderr`) and makes those logs available to you.
This guide will walk you through exactly how to find and view those precious log outputs, turning your black-box script into a transparent, debuggable process.
The Setup: A Sample Script and Container
To see logging in action, we need something to run. Let’s create a simple Bash script that simulates a multi-step job. It will print informational messages, simulate work with `sleep`, and even send a message to standard error.
Step 1: Create the Bash Script
Create a file named `hello-job.sh`. This script is designed to show how different types of output are handled.
#!/bin/bash
echo "[INFO] Job starting at $(date)"
echo "[INFO] This is a standard output message."
sleep 3
echo "[DEBUG] Performing a critical task..."
# Simulate some work
for i in {1..3}; do
echo "[DEBUG] Processing item $i..."
sleep 2
done
# This line sends output specifically to standard error (stderr)
echo "[WARN] This is a warning message sent to stderr." >&2
sleep 2
echo "[INFO] Job finished at $(date)"
The key takeaway here is that any line produced by `echo` goes to `stdout` by default, and you can redirect to `stderr` using `>&2`. Code Engine captures both.
Step 2: Dockerize the Script
Code Engine runs containers, so we need to package our script into a Docker image. Create a file named `Dockerfile` in the same directory.
# Use a small, efficient base image
FROM alpine:latest
# Set the working directory inside the container
WORKDIR /app
# Copy the script into the container and make it executable
COPY hello-job.sh .
RUN chmod +x hello-job.sh
# Define the command that will run when the container starts
CMD ["./hello-job.sh"]
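Before pushing anywhere, it's worth a quick local sanity check that the container actually produces the expected log lines. A possible sketch, using a throwaway local tag (`hello-job:dev` is a placeholder name):

```shell
# Build the image locally under a temporary tag.
docker build -t hello-job:dev .

# Run it once and watch the output; --rm cleans up the container
# afterwards. You should see the same [INFO]/[DEBUG]/[WARN] lines
# the script prints.
docker run --rm hello-job:dev
```

If the output looks right here, it will look the same in Code Engine's logs, since both are just the container's stdout and stderr.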
Step 3: Build and Push the Image
Now, build the image and push it to a container registry. This could be IBM Cloud Container Registry, Docker Hub, or any other OCI-compliant registry. Replace `<your-registry>` and `<your-image-name>` with your details.
# Build the Docker image
docker build -t <your-registry>/<your-image-name>:1.0 .
# Push the image to the registry
docker push <your-registry>/<your-image-name>:1.0
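If you go with IBM Cloud Container Registry specifically, the registry hostname is region-specific and you authenticate through the CLI. A hedged example, assuming the `us.icr.io` endpoint and a `<namespace>` you have created in the registry:

```shell
# Authenticate your local Docker client against
# IBM Cloud Container Registry.
ibmcloud cr login

# Tag and push using the regional endpoint (e.g. us.icr.io,
# de.icr.io) and your registry namespace.
docker build -t us.icr.io/<namespace>/hello-job:1.0 .
docker push us.icr.io/<namespace>/hello-job:1.0
```

Using the IBM-hosted registry keeps the image in the same cloud as the job, which also simplifies access permissions.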
Deploying and Running the Job
With our container image ready, let's create and run a Job in Code Engine. We'll use the IBM Cloud CLI, as it's fast and scriptable.
First, ensure you're logged in and have targeted your Code Engine project:
ibmcloud login
ibmcloud target -g <your-resource-group>
ibmcloud ce project select --name <your-project-name>
Step 4: Create the Code Engine Job
A Job in Code Engine is designed for tasks that run to completion. This is different from an App, which is for long-running services that respond to HTTP requests. Our script is a perfect fit for a Job.
ibmcloud ce job create --name my-bash-job --image <your-registry>/<your-image-name>:1.0
This command defines the job but doesn't run it yet.
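For real workloads you may also want to bound the job's behavior at creation time. The flags below are a sketch; verify the exact names with `ibmcloud ce job create --help` on your CLI version:

```shell
# Optional hardening flags (names may vary by CLI release):
#   --retrylimit 2          retry a failed run up to 2 times
#   --maxexecutiontime 600  cap each run at 600 seconds
ibmcloud ce job create --name my-bash-job \
  --image <your-registry>/<your-image-name>:1.0 \
  --retrylimit 2 --maxexecutiontime 600
```

A sensible execution-time cap is cheap insurance against a hung script silently consuming resources.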
Step 5: Run the Job
To execute the job, you submit a Job Run. This creates a new instance of your job's container and runs it.
ibmcloud ce jobrun submit --name my-bash-job-run-01 --job my-bash-job
The job will now run in the background. But where are the logs?
Finding Your Bash Script Logs
Here are the three primary ways to access your script's output, from quick debugging to production-grade monitoring.
Method 1: The Command Line (Your Best Friend)
The quickest way to see what your script did is with the CLI. Each job run has a unique name (we named ours `my-bash-job-run-01`), and you can fetch its logs directly with the `ibmcloud ce jobrun logs` command:
ibmcloud ce jobrun logs --name my-bash-job-run-01
The output will be the complete, combined `stdout` and `stderr` from your script, streamed directly to your terminal:
[INFO] Job starting at ...
[INFO] This is a standard output message.
[DEBUG] Performing a critical task...
[DEBUG] Processing item 1...
[DEBUG] Processing item 2...
[DEBUG] Processing item 3...
[WARN] This is a warning message sent to stderr.
[INFO] Job finished at ...
And there it is! Every `echo`, right where you need it. This method is fantastic for immediate feedback and debugging during development.
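Two small CLI habits make this even smoother. If you've forgotten the run name, you can list recent runs; and for a job that's still executing, you can stream logs live instead of fetching them afterwards (check `ibmcloud ce jobrun logs --help` on your CLI version for the exact flag):

```shell
# List recent job runs to recover the run name.
ibmcloud ce jobrun list

# Stream logs live while the job is still running.
ibmcloud ce jobrun logs --follow --name my-bash-job-run-01
```

Live streaming is especially handy for longer jobs, where you'd otherwise be re-running the logs command to poll for progress.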
Method 2: The Visual Route (IBM Cloud Console)
If you prefer a graphical interface, the IBM Cloud Console provides a clear, step-by-step way to find your logs.
- Navigate to the Code Engine projects page in the IBM Cloud Console.
- Select the project where your job is located.
- In the left-hand navigation pane, under Workloads, click Jobs.
- Click on the name of your job (e.g., `my-bash-job`).
- You'll land on the job's details page. Click the Job runs tab. This lists every execution of the job.
- Find and click on the specific run you want to inspect (e.g., `my-bash-job-run-01`).
- On the job run's details page, you will see a summary. The logs are right there, often visible by default or under a Logging or Logs tab/section.
The console presents the same log data as the CLI, providing a useful visual alternative for exploring past job runs.
Method 3: The Production-Ready Way (Log Analysis Integration)
For one-off debugging, the CLI and Console are perfect. But what about production jobs that run frequently? Or when you need to search logs from a week ago? Or set up alerts for errors?
This is where integrating with IBM Cloud Log Analysis comes in. It's a managed service designed for long-term log storage, powerful searching, and monitoring.
The best part? The integration is automatic.
- How it works: If you provision an IBM Cloud Log Analysis instance in the same region as your Code Engine project, with platform logs enabled for that region, Code Engine automatically forwards all application and job logs to it.
- Benefits:
- Centralization: All your logs from all your jobs and apps in one place.
- Long-Term Retention: Keep logs for days, weeks, or months based on your plan.
- Powerful Search: Filter by job name, log level, or any text within the log message.
- Alerting: Get notified via Slack, PagerDuty, or email when an error message like our `[WARN]` appears.
You don't need to change your script or your Dockerfile. Simply by having a Log Analysis instance available, your `echo` statements gain superpowers.
Conclusion: Your `echo` Has a Home
Running Bash scripts on IBM Cloud Code Engine doesn't mean sacrificing visibility. The platform is built to capture the fundamental outputs, `stdout` and `stderr`, that developers have relied on for decades. Your `echo` statements aren't lost in the void; they're waiting for you.
To recap:
- For quick debugging, use `ibmcloud ce jobrun logs` for instant feedback.
- For a visual check of past runs, navigate through the IBM Cloud Console.
- For robust, long-term monitoring in production, connect your project to IBM Cloud Log Analysis.
Now you can deploy your scripts with confidence, knowing exactly where to look to verify their success or diagnose their failures.