mirror of https://github.com/containers/ansible-podman-collections.git synced 2026-02-04 07:11:49 +00:00

Add inventory plugins for buildah and podman (#963)

Add inventory plugins for buildah and podman, unit tests and functional CI tests.
---------

Signed-off-by: Sagi Shnaidman <sshnaidm@redhat.com>
Sergey 2025-08-13 16:48:50 +03:00 committed by GitHub
parent fb76891c50
commit 6ee2f3891b
GPG key ID: B5690EEEBB952194
57 changed files with 3856 additions and 8899 deletions

@@ -0,0 +1,51 @@
### Buildah connection playbook examples
This folder contains self-contained Ansible playbooks demonstrating how to build images with Buildah while executing steps inside a working container through the Buildah connection plugin (`ansible_connection: containers.podman.buildah`). Each example shows a realistic workflow and explains the options used.
#### Prerequisites
- Podman and Buildah installed (rootless supported)
- Ansible installed (`ansible-core` recommended)
- Network access to pull base images
#### How these playbooks work
- A working container is created on localhost using `buildah from <image>`.
- The playbook dynamically adds a temporary inventory host whose `ansible_connection` is `containers.podman.buildah` and `remote_addr` is the Buildah working container ID.
- File operations and commands within the container use the Buildah connection plugin (no SSH), so modules like `copy`, `command`, and `shell` act inside the container.
- Image metadata/commit/push operations are executed on localhost with `buildah config/commit/push` referencing the same container ID.
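The flow above can be condensed into a single minimal playbook. This is an illustrative sketch only: the host alias `buildcntr`, base image `alpine`, and image name `demo:latest` are placeholders, not values used by the shipped examples.

```yaml
---
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Create a Buildah working container on the host
      command: buildah from alpine
      register: from_out

    - name: Register it as an inventory host that uses the Buildah connection
      add_host:
        name: buildcntr
        ansible_connection: containers.podman.buildah
        ansible_host: "{{ from_out.stdout | trim }}"

    - name: Run a command inside the working container (no SSH involved)
      command: cat /etc/os-release
      delegate_to: buildcntr

    - name: Commit the result back on the host
      command: buildah commit {{ from_out.stdout | trim }} demo:latest
```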
#### Common variables
- `buildah_base_image`: base image to start the working container (varies per example)
- `image_name`: final image name (and optional tag)
- `ansible_buildah_working_directory`: working directory inside the container for all build steps (passed to the connection plugin)
#### Examples
1) build_node_ai_api.yml — Node.js AI prediction API image without a Dockerfile
- Starts from `node:24`, copies `package.json` and app sources to `/app`, runs `npm install`, sets image metadata, commits to `my-ai-node-app:latest`.
- Options highlighted:
- `ansible_connection: containers.podman.buildah`
- `ansible_buildah_working_directory: /app`
2) build_go_ai_multistage.yml — Multi-stage Go build to a minimal runtime image
- Stage 1: compile inside a `golang:1.23` working container, then fetch the compiled binary to the host.
- Stage 2: start `alpine:latest`, copy binary into the container, configure CMD and exposed port, commit `minimal-ai-inference:latest`.
- Shows how to move artifacts between stages using the connection plugin's `fetch_file` and a normal `copy`.
3) build_ai_env_with_ansible.yml — Create a consistent AI dev environment image with an Ansible role
- Starts from `python:3.11-slim`, then applies role `roles/ai-dev-env` which installs common data-science packages inside the container using raw/pip commands.
- Demonstrates layering higher-level Ansible logic on top of a Buildah working container.
4) gitlab_ci_build_model_image.yml — CI-friendly image build using Buildah connection (template)
- Builds and optionally pushes an image for a simple model serving app (`app.py`, `requirements.txt`).
- Designed to be called from GitLab CI; see the included `.gitlab-ci.yml` for a minimal job that runs `ansible-playbook`.
#### Running an example
```bash
cd playbook/examples
ansible-playbook build_node_ai_api.yml -e image_name=my-ai-node-app:latest
```
#### Notes
- The Buildah connection runs commands with `buildah run <container> …` under the hood; metadata operations such as `buildah config`, `commit`, and `push` still run on localhost and reference the working container ID.
- If you prefer persistent names, set `container_name` (Buildah will use named working containers). Otherwise, the container ID returned by `buildah from` is used.
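As a rough picture of what the examples drive through Ansible, the same flow in plain Buildah commands would look something like the following sketch (image and command names are illustrative, not taken from a specific playbook):

```bash
ctr=$(buildah from node:24)                              # create the working container
buildah run "$ctr" -- npm install                        # the connection plugin wraps commands like this
buildah config --port 3000 --cmd "node app.js" "$ctr"    # metadata operations run on the host
buildah commit "$ctr" my-ai-node-app:latest              # commit the image on the host
buildah rm "$ctr"                                        # remove the working container
```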

@@ -0,0 +1,41 @@
---
- name: Build a consistent AI dev environment image using Ansible and Buildah connection
  hosts: localhost
  gather_facts: false
  vars:
    base_image: "python:3.11-slim"
    image_name: "ai-dev-env:latest"
    workdir: "/workspace"
  tasks:
    - name: Start Buildah working container
      command: buildah from {{ base_image }}
      register: from_out

    - set_fact:
        container_id: "{{ from_out.stdout | trim }}"

    - name: Configure working directory
      command: buildah config --workingdir {{ workdir }} {{ container_id }}

    - name: Add working container as dynamic host (Buildah connection)
      add_host:
        name: "buildcntr_{{ container_id }}"
        ansible_connection: containers.podman.buildah
        ansible_host: "{{ container_id }}"
        ansible_buildah_working_directory: "{{ workdir }}"

    - name: Provision AI environment inside container using role
      import_role:
        name: ai-dev-env
      delegate_to: "buildcntr_{{ container_id }}"

    - name: Set container metadata
      shell: |
        buildah config --env "JUPYTER_TOKEN=ansible" {{ container_id }}
        buildah config --port 8888 {{ container_id }}
        buildah config --cmd "jupyter lab --ip=0.0.0.0 --no-browser" {{ container_id }}

    - name: Commit image
      command: buildah commit {{ container_id }} {{ image_name }}

@@ -0,0 +1,84 @@
---
- name: Build minimal Go-based AI inference image with multi-stage using Buildah connection
  hosts: localhost
  gather_facts: false
  vars:
    build_stage_image: "golang:1.23"
    runtime_image: "alpine:latest"
    image_name: "minimal-ai-inference:latest"
    source_dir: "{{ playbook_dir }}/go_app"
    build_workdir: "/app"
  tasks:
    - name: Ensure source exists
      stat:
        path: "{{ source_dir }}/main.go"
      register: go_src

    - name: Fail if source missing
      fail:
        msg: "Provide a Go main.go under {{ source_dir }}"
      when: not go_src.stat.exists

    - name: Start build stage container (Go toolchain)
      command: buildah from {{ build_stage_image }}
      register: build_from

    - set_fact:
        build_container: "{{ build_from.stdout | trim }}"

    - name: Add build container to inventory
      add_host:
        name: "build_stage_{{ build_container }}"
        ansible_connection: containers.podman.buildah
        ansible_host: "{{ build_container }}"
        ansible_buildah_working_directory: "{{ build_workdir }}"

    - name: Configure workdir
      command: buildah config --workingdir {{ build_workdir }} {{ build_container }}

    - name: Copy sources
      copy:
        src: "{{ source_dir }}/main.go"
        dest: "{{ build_workdir }}/main.go"
      delegate_to: "build_stage_{{ build_container }}"

    - name: Build static binary
      shell: CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o inference-engine main.go
      args:
        chdir: "{{ build_workdir }}"
      delegate_to: "build_stage_{{ build_container }}"

    - name: Fetch compiled binary to host
      fetch:
        src: "{{ build_workdir }}/inference-engine"
        dest: "{{ playbook_dir }}/inference-engine"
        flat: true
      delegate_to: "build_stage_{{ build_container }}"

    - name: Remove build container
      command: buildah rm {{ build_container }}

    - name: Start runtime container
      command: buildah from {{ runtime_image }}
      register: run_from

    - set_fact:
        run_container: "{{ run_from.stdout | trim }}"

    - name: Copy binary into runtime container
      command: buildah copy {{ run_container }} {{ playbook_dir }}/inference-engine /inference-engine

    - name: Configure runtime image
      shell: |
        buildah config --cmd "/inference-engine" {{ run_container }}
        buildah config --port 8080 {{ run_container }}

    - name: Commit image
      command: buildah commit {{ run_container }} {{ image_name }}

    - name: Cleanup host artifact
      file:
        path: "{{ playbook_dir }}/inference-engine"
        state: absent

@@ -0,0 +1,76 @@
---
- name: Build Node.js AI prediction API without Dockerfile using Buildah connection
  hosts: localhost
  gather_facts: false
  vars:
    buildah_base_image: "node:24"
    image_name: "my-ai-node-app:latest"
    workdir: "/app"
    # App sources live under examples/node_app/
    app_src_dir: "{{ playbook_dir }}/node_app"
  tasks:
    - name: Ensure app sources exist
      stat:
        path: "{{ app_src_dir }}/package.json"
      register: app_sources

    - name: Fail if sources are missing
      fail:
        msg: "Example sources not found under {{ app_src_dir }}. Provide package.json and app.js."
      when: not app_sources.stat.exists

    - name: Start Buildah working container
      command: buildah from {{ buildah_base_image }}
      register: from_out
      changed_when: true

    - name: Set container id fact
      set_fact:
        container_id: "{{ from_out.stdout | trim }}"

    - name: Add working container as a dynamic host to inventory
      add_host:
        name: "buildcntr_{{ container_id }}"
        ansible_connection: containers.podman.buildah
        ansible_host: "{{ container_id }}"
        ansible_buildah_working_directory: "{{ workdir }}"

    - name: Configure image metadata (workdir)
      command: buildah config --workingdir {{ workdir }} {{ container_id }}

    - name: Copy package files first for better layer caching
      copy:
        src: "{{ item }}"
        dest: "{{ workdir }}/"
      with_items:
        - "{{ app_src_dir }}/package.json"
        - "{{ app_src_dir }}/package-lock.json"
      delegate_to: "buildcntr_{{ container_id }}"

    - name: Install dependencies
      command: npm install
      delegate_to: "buildcntr_{{ container_id }}"

    - name: Copy application code
      copy:
        src: "{{ app_src_dir }}/app.js"
        dest: "{{ workdir }}/app.js"
      delegate_to: "buildcntr_{{ container_id }}"

    - name: Expose port and set default command
      shell: |
        buildah config --port 3000 {{ container_id }}
        buildah config --cmd "node app.js" {{ container_id }}

    - name: Commit image
      command: buildah commit {{ container_id }} {{ image_name }}

    - name: Show resulting image
      command: buildah images --format '{{"{{.Name}}:{{.Tag}}"}}\t{{"{{.Size}}"}}'
      register: imgs
      changed_when: false

    - debug:
        var: imgs.stdout_lines

@@ -0,0 +1,19 @@
package main

import (
	"fmt"
	"math/rand"
	"net/http"
)

func predict(w http.ResponseWriter, r *http.Request) {
	fmt.Fprintf(w, "prediction=%f\n", rand.Float64())
}

func main() {
	http.HandleFunc("/predict", predict)
	fmt.Println("Inference engine listening on :8080")
	http.ListenAndServe(":8080", nil)
}

@@ -0,0 +1,13 @@
from flask import Flask, jsonify
import random

app = Flask(__name__)

@app.get("/predict")
def predict():
    return jsonify({"prediction": random.random()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

@@ -0,0 +1,2 @@
flask==3.0.0

@@ -0,0 +1,11 @@
const express = require('express');
const app = express();

app.get('/predict', (req, res) => {
  // Dummy prediction endpoint
  res.json({ prediction: Math.random() });
});

app.listen(3000, () => console.log('AI prediction API listening on :3000'));

@@ -0,0 +1,18 @@
{
  "name": "ai-prediction-api",
  "version": "1.0.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "ai-prediction-api",
      "version": "1.0.0",
      "license": "MIT",
      "dependencies": {
        "express": "^4.18.2"
      }
    }
  }
}

@@ -0,0 +1,12 @@
{
  "name": "ai-prediction-api",
  "version": "1.0.0",
  "description": "Sample Node.js AI prediction API for Buildah example",
  "main": "app.js",
  "license": "MIT",
  "dependencies": {
    "express": "^4.18.2"
  }
}

@@ -0,0 +1,10 @@
---
# Minimal example role to provision an AI dev environment inside a Buildah working container
- name: Ensure pip present
  shell: python3 -m ensurepip || true

- name: Upgrade pip
  shell: python3 -m pip install --upgrade pip

- name: Install common data science packages
  shell: python3 -m pip install --no-cache-dir numpy pandas jupyterlab
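Since the base image already ships Python, the same provisioning could alternatively be expressed with the `ansible.builtin.pip` module instead of raw shell, which makes the task idempotent. This is a sketch of that variant, not what the role ships:

```yaml
- name: Install common data science packages (idempotent alternative)
  ansible.builtin.pip:
    name:
      - numpy
      - pandas
      - jupyterlab
    extra_args: --no-cache-dir
    executable: pip3
```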