Linux isn’t only the bedrock of DevOps; it is practically the backbone of the whole IT realm. From cloud engineering to cybersecurity, from data science setups to backend operations, the real control lies in the terminal. A solid knowledge of Linux commands is vital for every DevOps engineer. These commands are the tools for interacting with servers, managing deployments, and debugging infrastructure.
The following are 50 commands, grouped by category and presented with descriptions, examples, and commonly used options.
File and Directory Management
1. ls: Lists the contents of a directory, including files and subdirectories.
>> ls -l /var/log
Options:
-l: Use a long listing format.
-a, --all: List hidden files as well; do not ignore entries starting with '.'.
-h, --human-readable: Print sizes in a human-readable format when used with options such as -l or -s.
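These options are often combined; for example, to show all files, including hidden ones, in a long, human-readable listing:
>> ls -lah /var/log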
2. cd: Changes the current working directory. It's essential for navigating through the file system hierarchy.
>> cd /etc/nginx
Options:
cd - : returns to the previous directory you visited
cd .. : moves up one level in the directory tree
cd ~ : takes you to your home directory
3. pwd: Prints the complete path of the current working directory. This is handy for confirming your current position in the file system.
>> pwd
4. mkdir: Creates a new directory at the given path. Handy for organizing files or creating folder structures for projects.
>> mkdir -p /tmp/project/logs
Options:
-p, --parents: Creates parent directories as needed; no error if the directory already exists.
5. rm: Removes files or directories from the file system. Handle it carefully, as it can delete files irreversibly.
>> rm -rf /tmp/design
Options:
-r, -R, --recursive: It removes directories and their subdirectory contents recursively.
-f, --force: Never prompts during deletion; nonexistent files and arguments are ignored.
6. cp: This command copies files and directories from one location to another. It is frequently used in backup or deployment scripts.
>> cp -r source/ destination/
Options:
-r, -R, --recursive: It copies directories and their subdirectory contents recursively.
-u, --update: It copies only when the SOURCE file is newer than the destination file or when the destination file is missing.
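For example, to recursively copy only new or changed files into an existing backup directory:
>> cp -ru source/ destination/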
7. mv: Moves or renames files and directories. Useful when reorganizing file systems or renaming resources.
>> mv oldname.txt newname.txt
8. touch: Creates a new empty file or updates the timestamp of an existing file. Commonly used to create log or flag files.
>> touch deploy.log
9. find: Recursively searches for files and directories matching a pattern. Extremely useful in large systems to locate logs, configs, etc.
>> find /var/log -name "*.log"
Options:
-type: Restricts results to a specific file type, such as f for regular files or d for directories.
-mtime n: Matches files whose data was last modified exactly n*24 hours ago; use +n or -n to match more than or less than n days ago.
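For example, to find regular files ending in .log under /var/log that were modified within the last 7 days:
>> find /var/log -type f -name "*.log" -mtime -7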
10. tree: Displays the directory structure in a tree-like format. Great for visualizing nested directories.
>> tree /etc
Note: tree may not be installed by default and can be added using your distribution's package manager.
File Content Management
11. cat: Concatenates and displays file contents. Useful for quickly viewing or combining files.
>> cat /etc/hosts
12. less / more: Displays file contents one page at a time. Ideal for reading long files without loading everything at once.
>> less /var/log/syslog
13. head: Displays the first few lines of a file, defaulting to 10 lines. Frequently used to preview configuration or log files.
>> head -n 20 file.txt
14. tail: Displays the last few lines of a file. Extremely useful for monitoring live logs.
>> tail -f /var/log/nginx/access.log
Options:
-f, --follow: Follow the file in real time, printing new data as it is appended.
15. wc: Counts lines, words, and characters in files. Helpful in data processing and validation.
>> wc -l access.log
16. cut: Removes or extracts specific fields from each line of a file. Useful for parsing logs or structured text.
>> cut -d':' -f1 /etc/passwd
17. grep: Searches for specific patterns in text files. Ideal for log analysis and filtering outputs.
>> grep 'ERROR' /var/log/app.log
Options:
-i, --ignore-case: Ignore case distinctions in patterns and input data, so that characters that differ only in case match each other.
--no-ignore-case: Do not ignore case distinctions in patterns and input data. This is the default.
-n, --line-number: Prefix each line of output with the 1-based line number within its input file.
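For example, a case-insensitive search that also prints line numbers:
>> grep -in 'error' /var/log/app.log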
18. awk: A powerful pattern scanning and text processing tool. Great for extracting and analyzing structured data.
>> awk -F':' '{print $1}' /etc/passwd
Here -F':' sets the field separator to a colon, so $1 prints each username from /etc/passwd.
19. sed: A stream editor used to perform basic text transformations. Commonly used for finding and replacing tasks.
>> sed -i 's/foo/bar/g' file.txt
20. diff: Compares two files line by line to show differences. Handy during code or config review.
>> diff old.txt new.txt
21. sort: The sort command is used to arrange lines in a text file (or output from another command) in a specified order, typically alphabetical or numerical.
>> sort names.txt
Options:
-r, --reverse: Reverse the result of comparisons, sorting in descending order.
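For example, combining -r with -n sorts numerically in descending order (assuming a file sizes.txt containing one number per line):
>> sort -rn sizes.txt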
Permissions and Ownership
22. chmod: Modifies file or directory permissions. Vital for securing scripts, files, and directories.
>> chmod +x script.sh
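Permissions can also be set numerically; for example, to give the owner read and write access and everyone else read-only access:
>> chmod 644 file.txt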
23. chown: Changes the owner and group of a file. Useful for setting correct permissions in multi-user systems.
>> chown user:group file
24. umask: Sets default permission bits for new files. Helps control access policies systematically.
>> umask 022
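With a umask of 022, newly created files default to 644 (rw-r--r--) and new directories to 755 (rwxr-xr-x). Running umask with no arguments prints the current mask:
>> umask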
25. stat: Displays detailed information about a file's properties. Useful for checking timestamps and permissions.
>> stat file.txt
Process Management
26. ps: Displays information about active processes. Useful for checking if services are running.
>> ps aux | grep nginx
27. top: Real-time view of system processes. Allows sorting by CPU, memory usage, etc.
>> top
28. htop: An interactive version of top with a more user-friendly interface. Useful for resource monitoring.
>> htop
29. kill: Sends a signal to a process, generally to stop it. Commonly used to terminate hung processes.
>> kill -9 <pid>
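Note: kill -9 sends SIGKILL, which the process cannot catch or clean up after. It is usually better to try the default SIGTERM first and reserve -9 for processes that refuse to exit:
>> kill <pid>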
30. nice/ renice: nice starts a process with a given priority; renice changes the priority of an already running process.
>> nice -n 10 ./script.sh
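To change the priority of a process that is already running, pass its PID to renice:
>> renice -n 10 -p <pid>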
31. jobs/ bg/ fg: Manages background and foreground processes in the shell. Useful when multitasking in CLI.
The jobs command lists the jobs running in the background and foreground of the current shell.
>> jobs
bg is a process control command that resumes a suspended process while keeping it running in the background. The user can run a job in the background by adding a “&” symbol at the end of the command.
>> bg %1
The fg command moves a background job in the current shell environment into the foreground.
>> fg %1
32. nohup: Runs a command immune to hangups and continues even after logout. Great for long-running jobs.
>> nohup ./script.sh &
Networking
33. ping: Sends ICMP packets to test connectivity to another host. Basic tool for network diagnostics.
>> ping google.com
34. curl: Interacts with URLs and APIs via various protocols. Frequently used in testing APIs or downloading data.
>> curl -I https://example.com
Options:
-I, --head: Fetch the headers only.
-X, --request: Specifies a custom request method to use when communicating with the HTTP server, which defaults to GET.
-d, --data: Sends the specified data in a POST request to the HTTP server, in the same way that a browser does when a user has filled in an HTML form and presses the submit button.
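For example, a POST request that sends form data (the URL and payload here are placeholders for illustration):
>> curl -X POST -d 'name=value' https://example.com/api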
35. wget: Non-interactive tool for downloading files from the web. Supports HTTP, HTTPS, and FTP.
>> wget https://example.com/file.zip
36. netstat / ss: Shows open ports, socket statistics, and network connections. Helps diagnose network-related issues.
>> ss -tuln
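In the example above, -t and -u restrict the output to TCP and UDP sockets, -l shows only listening sockets, and -n prints numeric addresses and ports instead of resolving names.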
37. traceroute: Traces the path packets take to a network host. Useful for finding bottlenecks in the route.
>> traceroute google.com
38. dig: Performs DNS lookups and displays server responses. Good for troubleshooting DNS issues.
>> dig example.com
39. host: Simple DNS lookup utility. A quick alternative to dig.
>> host example.com
40. ip addr: Displays or configures network interfaces and addresses. ip is the modern, preferred replacement for older tools such as ifconfig.
>> ip addr show
41. telnet: The telnet command is a command-line utility used to establish a TCP/IP connection to a remote host and interact with it as if you were physically present at the command prompt.
>> telnet [domain_name or IP_address] [port_number]
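For example, to check whether a web server is accepting connections on port 80 (using a placeholder host):
>> telnet example.com 80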
Disk and System Monitoring
42. df: Reports file system disk space usage. Helps you monitor available and used storage.
>> df -h
Options:
-h, --human-readable: print sizes in human-readable format in powers of 1024.
43. du: Shows disk usage of files and directories. Useful for identifying space hogs.
>> du -sh /var/log
44. free: Displays memory and swap usage statistics. Helps identify memory bottlenecks.
>> free -m
45. uptime: Tells how long the system has been up and the average load. Useful for quick health checks.
>> uptime
46. iostat: Provides detailed CPU and I/O performance statistics. Useful for performance tuning and diagnostics.
>> iostat
Archiving and Compression
47. tar: Archives multiple files into one. Commonly used for backups or transferring sets of files.
>> tar -czvf backup.tar.gz /etc
Options:
-c, --create: Create a new archive.
-z: Compress the archive using gzip.
-v: Show the progress in the terminal.
-f, --file: Allows you to name your archive file.
-x: Extract an archive file.
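For example, to extract the archive created above into the current directory:
>> tar -xzvf backup.tar.gz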
48. gzip/ gunzip: Compresses or decompresses files using the gzip algorithm. Saves space and speeds up transfers.
>> gzip file.txt
>> gunzip file.txt.gz
49. zip/ unzip: Compresses or extracts zip files. Frequently used in Windows-compatible file operations.
>> zip -r archive.zip folder/
>> unzip archive.zip
50. scp: Securely copies files over SSH. Ideal for transferring files between servers.
>> scp file.txt user@host:/path/
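scp can also copy entire directories recursively, or pull files from the remote host to the local machine:
>> scp -r folder/ user@host:/path/
>> scp user@host:/path/file.txt .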
Conclusion
These 50 commands are indispensable for any DevOps professional. Use them daily, integrate them into scripts, and customize them for automation. Mastery comes with practice, so explore each command further and understand its deeper functionalities.