SIEM Program Documentation

Overview

The Security Information and Event Management (SIEM) program collects, stores, analyzes, and reports on security logs from sources that communicate over a wide range of network protocols. It helps organizations monitor their IT infrastructure, detect anomalies, and respond to security threats.

Features

  1. Log collection from sources spanning many network protocols (HTTP, FTP, SSH, DNS, and more).
  2. Persistent storage of logs, alerts, and reports in a SQLite database.
  3. Anomaly detection that raises alerts on high-severity events.
  4. Automatic generation of reports summarizing recent logs.

How It Works

The SIEM program operates by gathering logs from various sources, analyzing them for anomalies, and generating reports. Here’s a step-by-step breakdown of its functionality:

  1. Log Collection: The program continuously collects logs from various sources such as web servers, databases, and network devices. It supports multiple protocols like HTTP, FTP, and SSH.
  2. Log Storage: Collected logs are stored in a SQLite database, allowing for easy retrieval and management.
  3. Log Analysis: The program analyzes the logs for anomalies, such as high severity events, and generates alerts if suspicious activities are detected.
  4. Reporting: It generates daily and custom reports that summarize recent logs and alert information, which can be used for further analysis and compliance purposes.
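Each collected log entry is a small structured record. The sketch below shows the event shape produced by the program's collect_logs function (the "Web Server"/"HTTP" values are illustrative):

```python
from datetime import datetime

# Illustrative log event, matching the fields stored in the logs table
log_event = {
    "timestamp": datetime.now().isoformat(),  # ISO 8601 string, e.g. 2024-01-01T12:00:00
    "source": "Web Server",                   # where the log came from
    "event": "Sample event for HTTP",         # human-readable description
    "severity": "Low",                        # Low / High drives alerting
    "protocol": "HTTP",                       # protocol the source uses
}

print(sorted(log_event.keys()))
```

Storing timestamps as ISO 8601 strings keeps them sortable and comparable with plain string comparison in SQLite.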

Installation Instructions

On Linux

  1. Ensure you have Python 3 installed (the sqlite3 module used by the script ships with Python's standard library). You can check by running:
    python3 --version
  2. Download the SIEM script:
    wget https://example.com/siem.py
  3. Run the script:
    python3 siem.py

On Termux

  1. Open Termux and update the package list:
    pkg update
  2. Install Python:
    pkg install python
  3. Install SQLite:
    pkg install sqlite
  4. Download the SIEM script:
    wget https://example.com/siem.py
  5. Run the script:
    python siem.py

Source Code

The following is the complete source code for the SIEM program:


import sqlite3
import time
from datetime import datetime, timedelta

# Database setup
conn = sqlite3.connect('siem.db')
c = conn.cursor()
c.execute('''CREATE TABLE IF NOT EXISTS logs
             (timestamp TEXT, source TEXT, event TEXT, severity TEXT, protocol TEXT)''')
c.execute('''CREATE TABLE IF NOT EXISTS alerts
             (timestamp TEXT, alert TEXT)''')
c.execute('''CREATE TABLE IF NOT EXISTS reports
             (timestamp TEXT, report TEXT)''')
conn.commit()

# Simulate log collection from a source (emits a sample event; replace with
# real log ingestion for production use)
def collect_logs(source, protocol):
    log_event = {
        "timestamp": datetime.now().isoformat(),
        "source": source,
        "event": f"Sample event for {protocol}",
        "severity": "Low",  # This can be enhanced based on actual log analysis
        "protocol": protocol
    }
    return log_event

# Function to store logs
def store_log(log_event):
    c.execute("INSERT INTO logs (timestamp, source, event, severity, protocol) VALUES (?, ?, ?, ?, ?)",
              (log_event['timestamp'], log_event['source'], log_event['event'], log_event['severity'], log_event['protocol']))
    conn.commit()

# Function to analyze logs
def analyze_logs():
    c.execute("SELECT * FROM logs")
    all_logs = c.fetchall()
    if all_logs:
        print("Collected Logs:")
        for log in all_logs:
            print(log)
            detect_anomalies(log)

# Function to detect anomalies and generate alerts
def detect_anomalies(log):
    if log[3] == "High":  # severity is at index 3 of the logs row
        alert_message = f"High severity alert detected from {log[1]} at {log[0]}"
        store_alert(alert_message)

# Function to store alerts
def store_alert(alert_message):
    timestamp = datetime.now().isoformat()
    c.execute("INSERT INTO alerts (timestamp, alert) VALUES (?, ?)", (timestamp, alert_message))
    conn.commit()

# Function to generate scheduled reports
def generate_report():
    timestamp = datetime.now().isoformat()
    cutoff = (datetime.now() - timedelta(days=1)).isoformat()
    c.execute("SELECT * FROM logs WHERE timestamp > ?", (cutoff,))
    recent_logs = c.fetchall()
    report_content = f"Daily Report - {timestamp}\n"
    report_content += "Recent Logs:\n"
    for log in recent_logs:
        report_content += f"{log}\n"
    c.execute("INSERT INTO reports (timestamp, report) VALUES (?, ?)", (timestamp, report_content))
    conn.commit()
    print(report_content)

# Function to gather logs from various protocols
def gather_from_sources():
    sources = {
        'Web Server': ['HTTP', 'HTTPS'],
        'FTP Server': ['FTP', 'SFTP'],
        'SSH Server': ['SSH'],
        'DNS Server': ['DNS'],
        'Mail Server': ['SMTP', 'POP3', 'IMAP'],
        'Network Device': ['SNMP', 'NetFlow', 'ICMP', 'BGP', 'DHCP', 'ARP', 'GRE', 'ICMPv6'],
        'Authentication Server': ['RADIUS', 'LDAP', 'Kerberos'],
        'IoT Device': ['MQTT', 'CoAP'],
        'Voice Server': ['SIP', 'RTP', 'RTCP'],
        'Time Server': ['NTP'],
        'Tunneling Protocols': ['PPTP', 'GRE'],
        'Transport Protocols': ['SCTP', 'DCCP'],
        'Security Protocols': ['IPsec', 'SSL/TLS']
    }
    
    for source, protocols in sources.items():
        for protocol in protocols:
            log_event = collect_logs(source, protocol)
            store_log(log_event)

# Main loop for SIEM
def main():
    while True:
        gather_from_sources()
        analyze_logs()
        generate_report()  # Summarize logs from the last 24 hours each cycle
        time.sleep(30)  # Collect logs every 30 seconds

if __name__ == "__main__":
    main()
    

Usage Instructions

Once the SIEM program is installed and running, follow these instructions to make the most of its features:

  1. Start the Program: Run the script using python3 siem.py. It loops indefinitely; stop it with Ctrl+C.
  2. Monitor Logs: The program will continuously collect logs from configured sources.
  3. Check Alerts: Monitor the generated alerts for any high-severity events detected during log analysis.
  4. Review Reports: Access daily reports generated by the system for insights into recent logs and security events.
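Alerts and reports can also be inspected directly from the SQLite database. The following is a minimal sketch assuming the alerts schema from the source code above; an in-memory database stands in for siem.db here:

```python
import sqlite3
from datetime import datetime

# In practice, connect to the program's database: sqlite3.connect("siem.db")
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS alerts (timestamp TEXT, alert TEXT)")
c.execute("INSERT INTO alerts VALUES (?, ?)",
          (datetime.now().isoformat(),
           "High severity alert detected from SSH Server"))
conn.commit()

# Fetch the most recent alerts, newest first (ISO timestamps sort lexically)
c.execute("SELECT timestamp, alert FROM alerts ORDER BY timestamp DESC LIMIT 10")
rows = c.fetchall()
for timestamp, alert in rows:
    print(timestamp, alert)
```

The same pattern works for the reports table by swapping the table and column names.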

Notes and Considerations

Data Privacy: Ensure that the logs collected do not violate privacy regulations. Anonymize sensitive information as necessary.
Performance Impact: Monitor your systems while the SIEM program is running; high log volumes may degrade performance.
Security Updates: Regularly update the SIEM program to include the latest security patches and improvements.
Backup Logs: Implement a strategy to back up logs periodically to prevent data loss.
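One way to back up the log database periodically is sqlite3's built-in backup API, which copies the database atomically even while it is in use. A minimal sketch (in-memory databases stand in for siem.db and the backup file, which are the assumed targets in practice):

```python
import sqlite3

# Source database; in practice: sqlite3.connect("siem.db")
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE logs (timestamp TEXT, source TEXT, event TEXT,"
            " severity TEXT, protocol TEXT)")
src.execute("INSERT INTO logs VALUES ('2024-01-01T00:00:00', 'Web Server',"
            " 'Sample', 'Low', 'HTTP')")
src.commit()

# Destination; in practice a file such as sqlite3.connect("siem_backup.db")
dst = sqlite3.connect(":memory:")
src.backup(dst)  # copies the entire database atomically

count = dst.execute("SELECT COUNT(*) FROM logs").fetchone()[0]
print(count)
```

Scheduling this from cron (or a Termux job) gives regular point-in-time copies without stopping the SIEM loop.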