
Linux Pipes and Filters

What Are Pipes?

The pipe (|) passes the output of one command as the input of another. It is one of Linux's most powerful features.

# Basic syntax
command1 | command2 | command3

# Examples
ls -la | grep "\.txt" # show only .txt files
ps aux | grep nginx # find nginx processes
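To see the data flow concretely, here is a minimal, self-contained demonstration (the fruit names are made-up sample data):

```shell
# A pipe connects the stdout of the left command to the stdin of the
# right one, so no temporary file is needed. These two are equivalent:
printf 'banana\napple\ncherry\n' > /tmp/fruits.txt
sort /tmp/fruits.txt                     # temp-file version
printf 'banana\napple\ncherry\n' | sort  # pipe version: same output
```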

Core Filter Commands

1. grep - Searching and Filtering

# Searching within text
cat file.txt | grep "ERROR" # find lines containing ERROR
ps aux | grep apache # find Apache processes
cat log.txt | grep -v "INFO" # exclude INFO lines
cat log.txt | grep -i error # case-insensitive match
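The flags above can be checked against a few made-up log lines (sample data, not a real log):

```shell
# Sample data with mixed-case matches
printf 'INFO start\nERROR disk full\nerror retrying\nINFO done\n' > /tmp/demo.log

grep "ERROR" /tmp/demo.log   # exact case: 1 match
grep -i error /tmp/demo.log  # case-insensitive: 2 matches
grep -v "INFO" /tmp/demo.log # everything except INFO lines
grep -ci error /tmp/demo.log # -c prints the match count instead of the lines
```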

2. sort - Sorting

# Different ways to sort
cat names.txt | sort # alphabetically
cat numbers.txt | sort -n # numerically
ps aux | sort -k3 -nr # by CPU (column 3, descending)
ls -la | sort -k5 -n # by file size
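The difference between lexicographic and numeric sorting is easy to miss; a quick sketch with made-up numbers:

```shell
printf '10\n2\n1\n' | sort     # lexicographic: 1, 10, 2
printf '10\n2\n1\n' | sort -n  # numeric: 1, 2, 10
printf '10\n2\n1\n' | sort -nr # numeric, descending: 10, 2, 1
```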

3. uniq - Removing Duplicates

# uniq only collapses adjacent duplicates, so sort first!
cat file.txt | sort | uniq # remove duplicates
cat file.txt | sort | uniq -c # count occurrences
cat file.txt | sort | uniq -d # show only duplicated lines
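Why the sort matters: uniq only compares each line with the one directly before it. A small demonstration:

```shell
printf 'a\nb\na\n' | uniq           # a, b, a  -- the second "a" survives
printf 'a\nb\na\n' | sort | uniq    # a, b     -- duplicates are now adjacent
printf 'a\nb\na\n' | sort | uniq -c # counts: 2 a, 1 b
```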

4. cut - Extracting Columns

# Extracting specific columns
cat /etc/passwd | cut -d: -f1 # usernames
cat /etc/passwd | cut -d: -f1,3 # columns 1 and 3
echo "hello world" | cut -c1-5 # first 5 characters
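On a single made-up /etc/passwd-style record, the field selection looks like this:

```shell
# Hypothetical passwd entry: fields separated by ":"
line='alice:x:1001:1001:Alice:/home/alice:/bin/bash'
echo "$line" | cut -d: -f1      # alice
echo "$line" | cut -d: -f1,3    # alice:1001
echo "hello world" | cut -c1-5  # hello
```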

5. awk - Powerful Text Processing

# Working with columns
ps aux | awk '{print $1, $11}' # columns 1 and 11
df -h | awk '{print $1, $5}' # filesystem and usage percentage

# Conditional actions
ps aux | awk '$3 > 5.0 {print $1, $2, $3}' # processes using more than 5% CPU
cat access.log | awk '$9 == 404 {print $1}' # clients that hit 404 errors
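The CPU filter can be tried without ps by piping in a few made-up user/pid/cpu rows:

```shell
# Columns: user, pid, cpu% (hypothetical values)
printf 'root 1 0.5\nwww 42 7.2\napp 99 3.1\n' |
  awk '$3 > 5.0 {print $1, $2}'   # only "www 42" passes the filter
```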

6. head and tail - Beginning and End

# Taking part of the data
ps aux | head -10 # first 10 lines
cat large_file.txt | tail -20 # last 20 lines
tail -f /var/log/syslog | grep "ERROR" # watch for errors in real time
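Combining the two lets you slice out a line in the middle; a quick check using seq as sample input:

```shell
seq 1 100 | head -3            # first three lines: 1 2 3
seq 1 100 | tail -2            # last two lines: 99 100
seq 1 100 | head -10 | tail -1 # line 10 only
```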

7. wc - Counting

# Different kinds of counts
cat file.txt | wc -l # number of lines
ps aux | grep nginx | wc -l # number of nginx processes
ls -1 | wc -l # number of files
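Besides -l, wc also counts words and bytes; a quick sketch on inline sample text:

```shell
printf 'one two\nthree\n' | wc -l  # 2 lines
printf 'one two\nthree\n' | wc -w  # 3 words
printf 'abc\n' | wc -c             # 4 bytes (the newline counts)
```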

Practical Examples

1. Log analysis

# Most frequent client IP addresses
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10

# Number of 404 errors
cat access.log | grep "404" | wc -l # caution: also matches "404" elsewhere on the line

# Most requested pages
cat access.log | awk '{print $7}' | sort | uniq -c | sort -nr | head -10
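These pipelines can be tried on a tiny synthetic access log (made-up IPs and paths; in the combined log format, $1 is the client IP, $7 the request path, and $9 the status code):

```shell
cat > /tmp/access.log <<'EOF'
1.1.1.1 - - [10/Oct/2025:00:00:01 +0000] "GET /index HTTP/1.1" 200 512
2.2.2.2 - - [10/Oct/2025:00:00:02 +0000] "GET /about HTTP/1.1" 404 128
1.1.1.1 - - [10/Oct/2025:00:00:03 +0000] "GET /index HTTP/1.1" 200 512
EOF

# Top IP: prints "2 1.1.1.1" (with leading spaces from uniq -c)
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -nr | head -1

# Counting 404s by the status field avoids false matches elsewhere on the line
awk '$9 == 404' /tmp/access.log | wc -l   # 1
```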

2. System monitoring

# Processes using the most CPU
ps aux | sort -k3 -nr | head -10

# Disk usage
df -h | grep -vE '^Filesystem|tmpfs' | awk '{print $1, $5}'

# Number of listening sockets
ss -tuln | grep LISTEN | wc -l

3. File operations

# Largest files
ls -la | sort -k5 -nr | head -10

# Number of files with a given extension
ls -1 | grep "\.txt$" | wc -l

# Non-empty files
find . -name "*.log" | xargs ls -la | awk '$5 > 0 {print $9, $5}'
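Note that the xargs ls pattern breaks on filenames containing spaces; find -size can express the same non-empty check more robustly. A sketch with made-up files in a hypothetical /tmp/find_demo directory:

```shell
mkdir -p /tmp/find_demo
printf 'data' > /tmp/find_demo/full.log
: > /tmp/find_demo/empty.log   # create an empty file

# -size +0c matches files larger than 0 bytes, regardless of name
find /tmp/find_demo -name "*.log" -size +0c
```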

4. Docker container analysis

# Number of running containers
docker ps | tail -n +2 | wc -l

# Most used images
docker ps -a --format "{{.Image}}" | sort | uniq -c | sort -nr

# Containers grouped by status
docker ps -a --format "{{.Status}}" | awk '{print $1}' | sort | uniq -c

Complex Pipeline Example

#!/bin/bash
# Nginx log analysis

LOG_FILE="/var/log/nginx/access.log"

echo "=== Nginx Log Analysis ==="

# Top 10 IP addresses
echo "Most frequent client IPs:"
cat "$LOG_FILE" | \
awk '{print $1}' | \
sort | \
uniq -c | \
sort -nr | \
head -10 | \
awk '{printf "%-15s %s hits\n", $2, $1}'

# Distribution of HTTP status codes
echo -e "\nHTTP status codes:"
cat "$LOG_FILE" | \
awk '{print $9}' | \
sort | \
uniq -c | \
sort -nr | \
awk '{printf "%-5s %s hits\n", $2, $1}'

# Hourly traffic distribution
echo -e "\nHourly traffic:"
cat "$LOG_FILE" | \
awk '{print $4}' | \
cut -c14-15 | \
sort | \
uniq -c | \
awk '{printf "%02d:00 - %s requests\n", $2, $1}'

Best Practices

1. Test your pipeline step by step

# Try each stage separately
cat data.txt | grep "pattern" # stage 1
cat data.txt | grep "pattern" | sort # stage 2
cat data.txt | grep "pattern" | sort | uniq # stage 3

2. Efficiency with large files

# LC_ALL=C makes sort faster on large files
cat large_file.txt | LC_ALL=C sort | uniq

# Limit the input with head/tail
cat large_file.txt | head -1000 | grep "pattern"

3. Catching errors

# Detect failures anywhere in a pipeline
set -o pipefail
command1 | command2 | command3 || echo "An error occurred!"
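To see what pipefail changes, compare the exit status of a pipeline whose first command fails (false stands in for any failing command):

```shell
# By default the pipeline's exit status is the LAST command's:
false | true; echo "status: $?"   # status: 0 -- the failure is hidden

set -o pipefail
false | true; echo "status: $?"   # status: 1 -- the failure surfaces
set +o pipefail
```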

4. Use tee for debugging

# Save intermediate results
cat input.txt | tee step1.txt | sort | tee step2.txt | uniq > final.txt
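A concrete run on made-up input shows what each tee captures:

```shell
printf 'b\na\nb\n' | tee /tmp/step1.txt | sort | tee /tmp/step2.txt | uniq > /tmp/final.txt
wc -l < /tmp/step1.txt  # 3: raw input, as it entered the pipe
wc -l < /tmp/step2.txt  # 3: after sort, before uniq
wc -l < /tmp/final.txt  # 2: duplicates removed
```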

Important Notes

  • uniq only collapses adjacent duplicates, so sort the input first
  • By default a pipeline's exit status is that of its last command; use set -o pipefail to catch failures anywhere in the chain
  • When working with large files, limit the input with head/tail
  • awk and sed are very powerful, but reserve them for more complex operations
  • Always try commands on test data first

With this guide, you will learn to use pipes and filters in Linux effectively!