Core Concepts
Why Automation Matters
Manual configuration doesn't scale. Configuration management tools (Ansible, Puppet) ensure consistency, reduce human error, enable rapid deployment, and provide version-controlled infrastructure. IaC (Infrastructure as Code) treats server config like application code.
Bash vs Python
Bash – best for system tasks, file manipulation, gluing commands together, quick one-liners and scripts.
Python – better for complex logic, data processing, API calls, cross-platform code, and when you need data structures (lists, dicts).
Linux+ XK0-006 tests both.
Agentless vs Agent-Based Config Management
- Ansible – agentless (uses SSH, no software on managed nodes, push model)
- Puppet – agent-based (Puppet agent installed on nodes, pull model, client checks in to Puppet server)
- Chef – agent-based (similar to Puppet)
- Salt – can be both (agent or SSH)
AI Best Practices (New in XK0-006)
Using AI tools (ChatGPT, GitHub Copilot, etc.) for code generation is now an explicit exam objective. Key practices:
- Validate all AI-generated code before running in production
- Understand what generated code does
- Be aware of security risks in AI-suggested code
- Use effective prompts, never blindly trust AI output
Tool Comparison at a Glance
| Tool | Type | Model | Language | Key Command |
|---|---|---|---|---|
| Ansible | Agentless | Push | YAML Playbooks | ansible-playbook |
| Puppet | Agent-based | Pull | Puppet DSL | puppet apply |
| Chef | Agent-based | Pull | Ruby (recipes) | chef-client |
| Salt | Both | Push/Pull | YAML / Python | salt |
Script Basics & Variables
Script Basics
First line (shebang): #!/bin/bash
Make executable: chmod +x script.sh
Run: ./script.sh or bash script.sh
Exit codes: 0=success, non-zero=failure
$? = last command exit code
exit 0 (success), exit 1 (error)
set -e (exit on any error), set -x (debug/trace mode)
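A minimal sketch pulling these basics together (the function name and the directory checked are illustrative assumptions):

```shell
#!/bin/bash
# Sketch of the basics above: shebang, set -e, exit codes, and $?.
set -e                       # abort on the first failing command

check_dir() {
    # return 0 (success) if the directory exists, 1 (failure) otherwise
    if [ -d "$1" ]; then
        return 0
    else
        return 1
    fi
}

check_dir /tmp               # /tmp is assumed to exist on the target host
echo "exit code: $?"         # $? holds the last command's exit code
```

Save as basics.sh, then chmod +x basics.sh and ./basics.sh to run it.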
Variables
Declare: NAME="Alice" (no spaces around =)
Use: echo $NAME or echo ${NAME}
Command substitution: DATE=$(date +%Y%m%d)
Read-only: readonly VAR=value
Unset: unset VAR
Special: $0=script name, $1 $2=positional args, $#=arg count, $@=all args, $$=script PID, $!=last background PID
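The special variables are easiest to see in a tiny sketch (the function name and arguments are made up); inside a function, $1, $#, and $@ refer to that function's own arguments:

```shell
#!/bin/bash
# Illustrates $#, $1, $@, and $? with an invented function.
show_args() {
    echo "count: $#"         # number of arguments
    echo "first: $1"         # first positional argument
    echo "all: $@"           # all arguments
}

show_args alpha beta gamma   # prints count: 3, first: alpha, all: alpha beta gamma
echo "last exit: $?"         # exit code of the previous command
```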
Conditionals
if / elif / else
if [ condition ]; then ... elif [ condition ]; then ... else ... fi
Test operators:
- -f file – file exists
- -d dir – directory exists
- -z "$VAR" – string is empty
- -n "$VAR" – string is not empty
- "$A" == "$B" – strings equal
- $A -eq $B – numbers equal
- $A -gt $B – greater than
- $A -lt $B – less than
Use [[ ]] for enhanced tests (regex, no word splitting)
case Statement
case $VAR in pattern1) ... ;; pattern2) ... ;; *) ... ;; esac
Good for multiple choices:
case $OS in "ubuntu") apt install nginx ;; "rhel") dnf install nginx ;; *) echo "Unknown" ;; esac
Cleaner than multiple if/elif for string matching.
Loops & Functions
for Loop
for i in 1 2 3; do echo $i; done
Range: for i in {1..10}; do ...; done
C-style: for ((i=0; i<10; i++)); do ...; done
Over files: for f in /var/log/*.log; do tail -n 5 "$f"; done
Over array: for item in "${ARRAY[@]}"; do ...; done
while Loop
while [ condition ]; do ...; done
Read file line by line:
while IFS= read -r line; do echo "$line"; done < file.txt
Infinite loop: while true; do ...; sleep 60; done
break (exit loop), continue (skip to next iteration)
Functions
function backup() { ... } or backup() { ... }
Call: backup
Args: $1, $2 inside function
Return value: return 0 (exit code only)
Pass strings via echo: result=$(get_value)
Local vars: local var=value
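A sketch of both "return" styles (function names and values are illustrative, and the uppercase expansion assumes bash 4+):

```shell
#!/bin/bash
# return communicates a status code; echo + $( ) passes back data.
to_upper() {
    local s="$1"             # local: not visible outside the function
    echo "${s^^}"            # bash 4+ uppercase parameter expansion
}

is_positive() {
    [ "$1" -gt 0 ]           # the test's exit code becomes the return value
}

result=$(to_upper "webserver01")   # capture the echoed string
echo "$result"                     # prints WEBSERVER01
if is_positive 5; then echo "positive"; fi
```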
Text Processing
grep / egrep
grep "pattern" file โ search
-i case-insensitive, -r recursive, -v invert/exclude
-c count matches, -l files with matches, -n line numbers
egrep or grep -E for extended regex
grep -P for Perl regex
grep "^Error" /var/log/app.log
sed (stream editor)
sed 's/old/new/g' file – substitute globally
sed -i 's/old/new/g' file – in-place edit
sed -n '5,10p' file – print lines 5–10
sed '/pattern/d' file – delete matching lines
sed '3a\new line' file – append "new line" after line 3
sed 's/[0-9]\+/NUM/g' – replace each run of digits with NUM
awk
awk '{print $1}' file – print field 1
awk -F: '{print $1}' /etc/passwd – use : as delimiter
awk '/pattern/ {print}' file – filter lines
awk '{sum+=$3} END {print sum}' file – sum column 3
awk 'NR==5' file – print line 5
awk 'NF>0' file – skip blank lines
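These tools compose naturally in pipelines; here is a self-contained sketch on generated sample data (the log format and file path are invented for the demo):

```shell
#!/bin/bash
# Demo data: level, subsystem, value (format invented for illustration).
printf 'ERROR disk 90\nINFO boot 1\nERROR cpu 75\n' > /tmp/demo.log

grep -c '^ERROR' /tmp/demo.log                            # count ERROR lines: 2
sed 's/^ERROR/CRITICAL/' /tmp/demo.log                    # relabel per line
awk '/^ERROR/ {sum += $3} END {print sum}' /tmp/demo.log  # sum field 3: 165
```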
Python Basics
Python Virtual Environments
python3 -m venv myenv – create virtual environment
source myenv/bin/activate – activate (Linux/macOS)
deactivate – exit
pip install requests – install package in venv
pip freeze > requirements.txt – export deps
pip install -r requirements.txt – install from file
Isolates project dependencies from system Python.
Python Data Types
str ("hello"), int (42), float (3.14), bool (True/False)
Collections:
- list – ordered, mutable: [1, 2, 3]
- tuple – ordered, immutable: (1, 2, 3)
- dict – key-value: {"key": "val"}
- set – unique, unordered: {1, 2, 3}
Type check: type(var)
Convert: int("42"), str(42), list(range(5))
Python os and sys Modules
import os:
- os.listdir('/path')
- os.path.exists('/file')
- os.path.join('dir', 'file')
- os.getcwd()
- os.environ['HOME']
- os.makedirs('/new/dir', exist_ok=True)
- os.rename('old', 'new')
import sys: sys.argv (args), sys.exit(1), sys.platform
Python File I/O
with open('file.txt', 'r') as f:
    content = f.read()        # read the whole file at once
# or iterate line by line (use a fresh handle, since read() exhausts the file):
with open('file.txt', 'r') as f:
    for line in f:
        print(line.strip())
with open('out.txt', 'w') as f:
    f.write("data\n")
# Append mode: open('log.txt', 'a')
# JSON:
import json
data = json.load(f)           # read JSON from an open file
json.dump(data, f)            # write JSON to an open file
with statement ensures file is closed even on error.
Git Version Control
Git Basics
git init – init repo
git clone URL – clone
git add file or git add . – stage
git commit -m "message" – commit
git status – what's staged/modified
git log --oneline – history
git diff – unstaged changes
git diff --staged – staged changes
Git Branches and Merging
git branch feature-x – create branch
git checkout feature-x or git switch feature-x – switch
git checkout -b feature-x – create + switch
git merge feature-x – merge into current branch
git rebase main – reapply commits on top of main
git branch -d feature-x – delete merged branch
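The branch workflow above can be walked through end to end in a throwaway repo (branch name, file, messages, and identity values are all placeholders):

```shell
#!/bin/bash
# Branch + merge walkthrough in a temporary repo (all names illustrative).
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name demo

echo base > app.txt
git add app.txt && git commit -q -m "initial commit"

git switch -q -c feature-x            # create + switch in one step
echo change >> app.txt
git commit -qam "feature work"

git switch -q -                       # back to the previous branch
git merge --no-ff -q -m "merge feature-x" feature-x
git branch -d feature-x               # safe delete: branch is fully merged
git log --oneline                     # initial, feature, and merge commits
```

--no-ff forces a merge commit even when a fast-forward would suffice, which keeps the feature branch visible in history.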
Git Remote and Tags
git remote add origin URL
git push origin main
git pull origin main
git fetch – download without merge
git tag v1.0.0 – lightweight tag
git tag -a v1.0.0 -m "Release" – annotated tag
git push origin --tags – push all tags
Tags mark releases for CI/CD pipelines.
Ansible
Ansible Architecture
- Agentless – uses SSH
- Control node runs Ansible
- Managed nodes need only SSH + Python
- Push model (control node pushes config)
- Written in Python; Playbooks in YAML
- Idempotent – running again gives the same result
- No daemon required
Ansible Inventory
Default: /etc/ansible/hosts (or custom file)
[webservers]
192.168.1.10
192.168.1.11

[webservers:vars]
http_port=80
Test: ansible all -i inventory -m ping
Ansible Playbook Structure
- name: play_name
  hosts: webservers
  become: yes
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present
    - name: Start nginx
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: yes
Run: ansible-playbook playbook.yml -i inventory
Ansible Key Modules
- package/apt/dnf – install packages
- service – manage services
- copy – copy files
- template – Jinja2 templates
- file – manage files/dirs/permissions
- user – manage users
- command/shell – run commands
- lineinfile – edit lines in files
- debug – print variables for troubleshooting
Puppet
Puppet Architecture
- Agent-based – Puppet agent installed on managed nodes
- Puppet server (master) holds manifests
- Pull model – agents check in every 30 min by default
- Manifests written in Puppet DSL (domain-specific language)
- Resources describe desired state (declarative)
Puppet Manifest Basics
class nginx {
  package { 'nginx':
    ensure => installed,
  }
  service { 'nginx':
    ensure  => running,
    enable  => true,
    require => Package['nginx'],
  }
}
Apply: puppet apply manifest.pp
Node checks in: puppet agent --test
CI/CD Pipelines
CI/CD in Linux Context
CI (Continuous Integration): automatically test code on each commit.
CD (Continuous Delivery/Deployment): automatically deploy tested code.
Tools: GitLab CI (.gitlab-ci.yml), GitHub Actions (.github/workflows/), Jenkins. Linux admins configure CI/CD pipelines that run shell scripts/Ansible playbooks.
Basic Pipeline Example
Trigger: push to main branch
Stages: build → test → deploy
Each stage runs shell commands or scripts. Artifacts passed between stages. On success: deploy to production. On failure: notify team, stop pipeline.
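As a concrete sketch, a minimal GitLab CI configuration implementing that flow might look like this (job names, script paths, and the deploy command are illustrative assumptions, not from the source):

```yaml
stages:
  - build
  - test
  - deploy

build_job:
  stage: build
  script:
    - ./build.sh            # illustrative build script
  artifacts:
    paths:
      - dist/               # artifact passed to later stages

test_job:
  stage: test
  script:
    - ./run_tests.sh        # a failure here stops the pipeline

deploy_job:
  stage: deploy
  script:
    - ansible-playbook deploy.yml   # e.g. hand off to an Ansible playbook
  only:
    - main                  # trigger: push to main branch
```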
AI Best Practices (New in XK0-006)
Responsible AI Code Generation
AI tools (GitHub Copilot, ChatGPT, etc.) can generate shell scripts and Python code.
ALWAYS:
- Review generated code line by line before running
- Test in a non-production environment first
- Verify permissions and security implications
- Understand what the code does; never run code you don't understand
Prompt Engineering for Sysadmin Tasks
Be specific in prompts:
GOOD: "Write a bash script that backs up /etc daily to /backup with date-stamped filenames and removes backups older than 30 days"
TOO VAGUE: "write a backup script"
Include: target OS, constraints, error handling requirements, output format. Iterate and refine.
AI Limitations in Linux Administration
- AI may suggest outdated commands (deprecated tools, old syntax)
- AI may not know your specific distro version or config
- AI may generate code with security vulnerabilities (hardcoded credentials, insecure permissions, injection risks)
- Always validate against official documentation
- AI is a tool to assist, not replace, understanding
Memory Hooks
High-retention mnemonics for exam-day recall.
Bash Script Checklist
Every script: #!/bin/bash first line → chmod +x script.sh to run → use $? to check exit codes → exit 0 success / exit 1 error → set -e to stop on errors → set -x to debug
Bash Special Variables
$0=script name, $1 $2=positional args, $#=number of args, $@=all args (quoted), $$=current PID, $?=last exit code. Most tested: $1, $?, $#
Python venv Pattern
python3 -m venv myenv → source myenv/bin/activate → pip install requests → pip freeze > requirements.txt. Never sudo pip install into system Python for projects!
Ansible vs Puppet
Ansible: SSH, YAML, agentless, push, run: ansible-playbook. Puppet: agent installed, DSL, pull (agent checks in every 30min), run: puppet apply or puppet agent --test
Git Branch Workflow
git checkout -b feature (create+switch). git merge feature (merge to current). git tag -a v1.0.0 -m "release" (tag commit). git push origin --tags. git log --oneline (history)
AI Code Safety
AI-generated code: always review line by line → test in non-prod → check security implications → verify you understand what it does → THEN use in production. Never run code you don't understand, regardless of source.
Flashcards
Bash: if/elif/else syntax with file and numeric tests
if [ -f /path/file ]; then echo 'exists'; elif [ "$COUNT" -gt 10 ]; then echo 'high'; else echo 'other'; fi
File tests:
-f (file), -d (dir), -e (exists), -z (empty string), -n (non-empty). Numeric: -eq, -ne, -gt, -lt, -ge, -le. Use [[ ]] for regex: [[ $VAR =~ ^[0-9]+$ ]]
Bash: for loop to process all .log files in /var/log
for f in /var/log/*.log; do echo "Processing: $f"; tail -n 100 "$f" >> /tmp/combined.log; done
Always quote path variables:
"$f", not $f. Range: for i in {1..10}. C-style: for ((i=0; i<10; i++)). To read lines, prefer while IFS= read -r line; do ...; done < file over for line in $(cat file), which word-splits.
Python: read a file and filter lines containing 'ERROR'
with open('/var/log/app.log', 'r') as f:
    for line in f:
        if 'ERROR' in line:
            print(line.strip())
with auto-closes file. strip() removes newline. JSON: json.load(f). CSV: csv.reader(f)
Python os module: 5 most useful functions for sysadmin scripts
- os.path.exists('/path') – check existence
- os.listdir('/path') – list directory
- os.makedirs('/new/dir', exist_ok=True) – create dirs
- os.environ.get('HOME') – env variable
- os.rename('old', 'new') – rename/move
- os.path.join('dir', 'file') – build paths safely
Ansible playbook: minimum structure to install and start nginx
- name: Configure webserver
  hosts: webservers
  become: yes
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present
    - name: Start nginx
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: yes
sed: 4 practical use cases with examples
1. Substitute: sed 's/old/new/g' file
2. In-place edit: sed -i 's/old/new/g' file
3. Delete matching lines: sed '/ERROR/d' file
4. Print specific lines: sed -n '10,20p' file
Use | as delimiter for URLs: sed 's|http://|https://|g'. -i.bak makes a backup before editing.
Git: tag a release and push it to remote
Lightweight: git tag v1.0.0
Annotated (preferred): git tag -a v1.0.0 -m 'Release 1.0.0'
Push specific: git push origin v1.0.0
Push all: git push origin --tags
List: git tag
Delete: git tag -d v1.0.0
Tags trigger CI/CD release pipelines.
AI best practices for Linux automation: 3 dos and 3 don'ts
DO:
Review AI code line by line before running.
Test in non-production environment first.
Use specific, detailed prompts (include OS, version, constraints).
DON'T:
Run AI-generated scripts as root without review.
Trust AI with credentials/secrets.
Assume AI-generated code is vulnerability-free.