Essential Linux Command Line Skills Every User Should Know

The command line is the most powerful interface to a Linux system. While GUIs are convenient, the terminal gives you precision, speed, and the ability to automate anything. This guide covers the essential commands you’ll use daily.

Navigating the Filesystem

# Print current directory
pwd

# List files
ls
ls -la          # Detailed list with hidden files
ls -lah         # Human-readable file sizes
ls -lt          # Sort by modification time

# Change directory
cd /var/log
cd ~            # Home directory
cd -            # Previous directory
cd ..           # Parent directory

File Operations

# Create files
touch newfile.txt
echo "content" > file.txt       # Create/overwrite
echo "more" >> file.txt         # Append

# Create directories
mkdir new-folder
mkdir -p path/to/nested/folder  # Create parent directories

# Copy files and directories
cp file.txt backup.txt
cp -r folder/ backup/           # Copy directory recursively

# Move or rename
mv old.txt new.txt
mv file.txt /tmp/

# Delete
rm file.txt
rm -r folder/                   # Remove directory
rm -rf folder/                  # Force remove (use carefully!)

# Find files
find / -name "*.log" -type f
find . -size +100M              # Files larger than 100MB
find . -mtime -7                # Modified in last 7 days
find . -name "*.tmp" -delete    # Find and delete

Viewing and Editing Files

# View entire file
cat file.txt

# View with pagination
less file.txt          # Navigate with arrows, q to quit
more file.txt

# View first/last lines
head -20 file.txt      # First 20 lines
tail -20 file.txt      # Last 20 lines
tail -f /var/log/syslog  # Follow (live updates)

# Count lines, words, characters
wc -l file.txt         # Line count
wc -w file.txt         # Word count

# Edit files
nano file.txt          # Simple, beginner-friendly editor
vim file.txt           # Powerful, modal editor

Text Processing

Linux shines at processing text. These tools are incredibly powerful when combined.

grep — Search in files

grep "error" logfile.txt
grep -r "TODO" ./src/           # Recursive search in directory
grep -i "warning" file.txt      # Case insensitive
grep -n "pattern" file.txt      # Show line numbers
grep -c "error" file.txt        # Count matches
grep -v "debug" file.txt        # Invert match (exclude lines)
grep -E "error|warning" file.txt  # Extended regex (OR)

sed — Stream editor

# Replace text (print to stdout)
sed 's/old/new/g' file.txt

# Replace in place
sed -i 's/old/new/g' file.txt

# Delete lines matching pattern
sed -i '/^#/d' file.txt         # Remove comment lines

# Replace only on specific lines
sed '3s/old/new/' file.txt      # Only line 3

awk — Column processing

# Print specific columns
awk '{print $1, $3}' file.txt

# Process CSV
awk -F',' '{print $2}' data.csv

# Sum a column
awk '{sum += $1} END {print sum}' numbers.txt

# Filter rows
awk '$3 > 100' data.txt

sort, uniq, cut

# Sort
sort file.txt
sort -n numbers.txt              # Numeric sort
sort -r file.txt                 # Reverse sort
sort -u file.txt                 # Sort and deduplicate

# Remove consecutive duplicates
uniq file.txt
uniq -c file.txt                 # Count occurrences

# Extract fields
cut -d',' -f1,3 data.csv        # Fields 1 and 3 from CSV
cut -c1-10 file.txt              # First 10 characters per line
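
These filters compose: one pipeline can extract a column, sort it, and count occurrences. A small sketch using a generated sample file (the log format and filenames here are invented for illustration):

```shell
# Hypothetical sample log: method, path, status code
printf '%s\n' \
  'GET /index.html 200' \
  'GET /missing 404' \
  'GET /gone 404' > sample.log

# Extract the status column, then count occurrences,
# most frequent first
awk '{print $3}' sample.log | sort | uniq -c | sort -rn
```

Note that `sort` must run before `uniq -c`, since `uniq` only collapses adjacent duplicates.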

Piping and Redirection

The real power of Linux comes from combining commands with pipes.

# Pipe output to another command
cat access.log | grep "404" | wc -l

# Chain multiple filters
ps aux | grep nginx | grep -v grep | awk '{print $2}'

# Redirect output to file
ls > files.txt                   # Overwrite
ls >> files.txt                  # Append

# Redirect errors
command 2> errors.txt            # Stderr to file
command > output.txt 2>&1        # Both stdout and stderr

# Discard output
command > /dev/null 2>&1

# Build a command for each input line
xargs -I {} echo "File: {}" < filelist.txt
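
Another composable piece worth knowing is `tee`, which writes the stream to a file while also passing it downstream, letting you capture the middle of a pipeline. A minimal sketch with made-up sample input:

```shell
# Sort some sample input, save the sorted copy with tee,
# and keep piping it on to uniq
printf 'b\na\nb\n' | sort | tee sorted.txt | uniq -c

# sorted.txt now holds the sorted lines: a, b, b
cat sorted.txt
```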

Permissions and Ownership

# View permissions
ls -la
# Output: -rwxr-xr-x 1 user group 4096 Jan 10 12:00 script.sh
#         ^^^^^^^^^^
#         file type, then owner/group/others: read/write/execute

# Change permissions (numeric)
chmod 755 script.sh              # rwxr-xr-x
chmod 644 config.txt             # rw-r--r--
chmod 600 private.key            # rw-------

# Change permissions (symbolic)
chmod +x script.sh               # Add execute for all
chmod u+w file.txt               # Add write for owner
chmod go-r secret.txt            # Remove read from group/others

# Change ownership
sudo chown user:group file.txt
sudo chown -R www-data:www-data /var/www/
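
To verify a mode numerically rather than eyeballing `ls -la`, GNU `stat` can print the octal permissions directly (BSD/macOS `stat` uses different flags, e.g. `stat -f '%Lp'`):

```shell
# Create a file and set its mode, then read the mode back
touch demo.sh
chmod 755 demo.sh

# %a prints the octal permissions, %n the file name (GNU coreutils)
stat -c '%a %n' demo.sh          # 755 demo.sh
```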

Common permission patterns

Numeric   Symbolic     Use Case
755       rwxr-xr-x    Executable scripts, directories
644       rw-r--r--    Regular files
600       rw-------    Private keys, passwords
700       rwx------    Private directories, .ssh

System Monitoring

# Disk usage
df -h                            # Filesystem usage
du -sh folder/                   # Directory size
du -sh * | sort -rh | head -10   # Top 10 largest items

# Memory
free -h

# CPU and processes
top                              # Basic process viewer
htop                             # Interactive (install: apt install htop)

# Running processes
ps aux                           # All processes
ps aux | grep nginx              # Find specific process

# Kill processes
kill PID                         # Graceful termination
kill -9 PID                      # Force kill
killall nginx                    # Kill all by name
pkill -f "python script"         # Kill by pattern

# System info
uname -a                         # Kernel info
hostname                         # Machine name
uptime                           # How long system has been running
lsb_release -a                   # Distribution info
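
These monitoring commands slot naturally into scripts. A minimal sketch of a disk-space check (the 80% threshold is an arbitrary example, and `df --output` is a GNU coreutils option):

```shell
# Extract the root filesystem's usage percentage as a bare number
usage=$(df --output=pcent / | tail -1 | tr -dc '0-9')

# Warn when usage crosses the (example) threshold
if [ "$usage" -gt 80 ]; then
    echo "Disk usage critical: ${usage}%"
else
    echo "Disk usage OK: ${usage}%"
fi
```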

Networking

# Check connectivity
ping google.com
ping -c 4 8.8.8.8               # Send exactly 4 packets

# DNS lookup
dig example.com
nslookup example.com
host example.com

# Network interfaces
ip addr                          # Show all interfaces
ip route                         # Show routing table

# Open ports and connections
ss -tlnp                         # TCP listening ports
ss -tunp                         # All connections with processes

# Download files
curl -O https://example.com/file.zip
curl -s https://api.example.com/data | jq .   # Fetch JSON API

wget https://example.com/file.zip
wget -r --no-parent https://example.com/docs/  # Recursive download

# Transfer files between machines
scp file.txt user@server:/path/
scp -r folder/ user@server:/path/

rsync -avz folder/ user@server:/path/          # Efficient sync
rsync -avz --delete src/ dest/                 # Mirror (delete extra files)

Package Management

Debian/Ubuntu (apt)

sudo apt update                  # Refresh package list
sudo apt upgrade                 # Upgrade installed packages
sudo apt install nginx           # Install
sudo apt remove nginx            # Remove (keep config)
sudo apt purge nginx             # Remove completely
sudo apt autoremove              # Clean unused dependencies
apt search keyword               # Search
apt show nginx                   # Package info

Red Hat/Fedora (dnf)

sudo dnf update
sudo dnf install nginx
sudo dnf remove nginx
dnf search keyword

Arch (pacman)

sudo pacman -Syu                 # Update everything
sudo pacman -S nginx             # Install
sudo pacman -R nginx             # Remove
pacman -Ss keyword               # Search

Archives and Compression

# Create tar archive
tar -cf archive.tar folder/

# Create compressed archive
tar -czf archive.tar.gz folder/        # gzip
tar -cjf archive.tar.bz2 folder/       # bzip2

# Extract
tar -xf archive.tar
tar -xzf archive.tar.gz
tar -xzf archive.tar.gz -C /target/    # Extract to specific directory

# List contents without extracting
tar -tzf archive.tar.gz

# Zip/unzip
zip -r archive.zip folder/
unzip archive.zip

Shell Scripting Basics

Create backup.sh:

#!/bin/bash
set -e  # Exit on error

# Variables
BACKUP_DIR="/backups"
DATE=$(date +%Y-%m-%d)
SOURCE="/home/user/data"

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Create archive
tar -czf "$BACKUP_DIR/backup-$DATE.tar.gz" "$SOURCE"

# Keep only last 7 days of backups
find "$BACKUP_DIR" -name "backup-*.tar.gz" -mtime +7 -delete

echo "Backup complete: $BACKUP_DIR/backup-$DATE.tar.gz"

Make it executable and run:

chmod +x backup.sh
./backup.sh

Conditional logic

if [ -f "/etc/nginx/nginx.conf" ]; then
    echo "Nginx config exists"
elif [ -d "/etc/apache2" ]; then
    echo "Apache is installed"
else
    echo "No web server found"
fi
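
The `[ ... ]` tests work because every command returns an exit status (0 means success), which also powers the short-circuit operators `&&` and `||`. A self-contained sketch (the demo file path is arbitrary):

```shell
# Create a file to test against
printf 'hello\n' > /tmp/cond-demo.txt

# -f: regular file, -s: non-empty, -x: executable
if [ -f /tmp/cond-demo.txt ]; then
    echo "file exists"
fi

# && runs the next command only on success, || only on failure
[ -s /tmp/cond-demo.txt ] && echo "file is non-empty"
[ -x /tmp/cond-demo.txt ] || echo "file is not executable"

rm /tmp/cond-demo.txt
```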

Loops

# For loop
for file in *.log; do
    echo "Processing: $file"
    gzip "$file"
done

# While loop
while read -r line; do
    echo "Line: $line"
done < input.txt
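
A common pitfall when looping over filenames: word splitting mangles names containing spaces. Pairing `find -print0` with bash's `read -d ''` avoids it (the sample filename here is invented):

```shell
# A log file whose name contains a space
touch "app server.log"

# NUL-separated names survive spaces and even newlines
find . -maxdepth 1 -name "*.log" -print0 |
while IFS= read -r -d '' file; do
    echo "Found: $file"
done

rm "app server.log"
```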

Cron jobs (scheduled tasks)

# Edit cron schedule
crontab -e

# Format: minute hour day month weekday command
# Examples:
0 3 * * * /home/user/backup.sh          # Daily at 3 AM
*/15 * * * * /usr/bin/health-check.sh    # Every 15 minutes
0 0 * * 0 /usr/bin/weekly-cleanup.sh     # Sundays at midnight
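
Cron runs jobs with a minimal environment and silently discards their output unless you capture it, so redirecting both streams to a log file is a common pattern (paths here are examples):

```shell
# Append stdout and stderr of the nightly backup to a log
0 3 * * * /home/user/backup.sh >> /var/log/backup.log 2>&1
```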

Quick Reference

Task                 Command
Find files           find . -name "*.txt"
Search in files      grep -r "pattern" .
Disk usage           df -h
Directory size       du -sh folder/
Running processes    ps aux
Kill process         kill -9 PID
Check ports          ss -tlnp
Download file        curl -O URL
Archive folder       tar -czf out.tar.gz folder/
File permissions     chmod 755 file
System info          uname -a

Conclusion

The Linux command line is a skill that compounds over time. Every command you learn makes the next task faster. Start with file navigation and basic operations, then gradually incorporate text processing, piping, and scripting. Within weeks, you’ll find yourself reaching for the terminal first — because it’s simply faster and more powerful than any GUI for most system tasks.