To run the provided Python program, you need to ensure that the following prerequisites are met on your system. This section provides instructions for installation on both Linux and Termux.
Follow these steps to install the required libraries and dependencies on a Linux system:
sudo apt update
sudo apt install python3 python3-pip python3-setuptools
sudo apt install libpcap-dev
sudo apt install build-essential
pip3 install scapy requests beautifulsoup4
Explanation:
python3: Python 3 interpreter.
python3-pip: Package installer for Python 3.
python3-setuptools: Utilities for building and distributing Python packages.
libpcap-dev: Development files for libpcap, needed by Scapy.
build-essential: Essential tools for compiling software.
scapy: For sending and manipulating network packets.
requests: For making HTTP requests.
beautifulsoup4: For parsing HTML content.
Use the following commands to set up the necessary environment on Termux:
pkg update
pkg install python
pkg install clang libffi libffi-dev
pip install scapy requests beautifulsoup4
Explanation:
python: Python interpreter.
clang: C compiler, required for building certain packages.
libffi and libffi-dev: Libraries for the Foreign Function Interface, needed by some Python packages.
scapy, requests, beautifulsoup4: Same as described for Linux.
Ensure that the script has executable permissions. You can set this using:
chmod +x your_script_name.py
This command makes the Python script executable from the terminal.
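Before running the script, it can help to confirm that the third-party libraries installed above are actually importable. The following is a minimal sketch of such a check; it only assumes the scapy, requests, and beautifulsoup4 packages from the installation steps.

# Minimal environment check (sketch): verifies that the libraries installed
# above can be imported and prints their reported versions.
import importlib

for module_name in ("scapy", "requests", "bs4"):
    try:
        module = importlib.import_module(module_name)
        version = getattr(module, "__version__", "unknown")
        print(f"{module_name}: OK (version {version})")
    except ImportError as error:
        print(f"{module_name}: MISSING ({error})")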
The following explains the provided Python script, which generates random network addresses, sends network packets, and performs web scraping.
import random
import time
import requests
from bs4 import BeautifulSoup
from scapy.all import *

# Function to generate a random MAC address
def generate_random_mac():
    return ':'.join(['{:02x}'.format(random.randint(0, 255)) for _ in range(6)])

# Function to generate a random IPv4 address
def generate_random_ipv4():
    return '.'.join(str(random.randint(0, 255)) for _ in range(4))

# Function to generate a random IPv6 address
def generate_random_ipv6():
    return ':'.join(['{:x}'.format(random.randint(0, 65535)) for _ in range(8)])

# Function to generate a random ToS for IPv4 (RFC 791)
def generate_random_tos():
    precedence = random.randint(0, 7)   # 3 bits for precedence
    delay = random.randint(0, 1)        # 1 bit for delay
    throughput = random.randint(0, 1)   # 1 bit for throughput
    reliability = random.randint(0, 1)  # 1 bit for reliability
    reserved = 0                        # 2 bits reserved
    tos = (precedence << 5) | (delay << 4) | (throughput << 3) | (reliability << 2) | reserved
    return tos

# Function to generate a random Traffic Class for IPv6 (RFC 2460)
def generate_random_traffic_class():
    qos = random.randint(0, 63)  # upper 6 bits (DSCP/QoS)
    ecn = random.randint(0, 3)   # lower 2 bits are ECN; the Flow Label is a separate 20-bit field
    traffic_class = (qos << 2) | ecn
    return traffic_class

# Function to send an Ethernet frame with a random MAC address
def send_ethernet_frame(destination_mac):
    source_mac = generate_random_mac()
    frame = Ether(src=source_mac, dst=destination_mac)
    sendp(frame, verbose=False)
    print(f"Sent Ethernet frame from {source_mac} to {destination_mac}")

# Function to send an IPv4 packet with random ToS
def send_ipv4_packet(destination_ip):
    tos = generate_random_tos()
    packet = IP(dst=destination_ip, tos=tos) / ICMP() / Raw(b'Hello')
    send(packet, verbose=False)
    print(f"Sent IPv4 packet to {destination_ip} with ToS={tos}")

# Function to send an IPv6 packet with random Traffic Class
def send_ipv6_packet(destination_ip):
    traffic_class = generate_random_traffic_class()
    packet = IPv6(dst=destination_ip, tc=traffic_class) / ICMPv6EchoRequest() / Raw(b'Hello')
    send(packet, verbose=False)
    print(f"Sent IPv6 packet to {destination_ip} with Traffic Class={traffic_class}")

# Function to generate a random User-Agent string
def generate_random_user_agent():
    user_agents = [
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Edge/91.0.864.48 Safari/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Firefox/89.0",
        "Mozilla/5.0 (Android 11; Mobile; rv:90.0) Gecko/90.0 Firefox/90.0",
        "Mozilla/5.0 (Android 10; Mobile; rv:89.0) Gecko/89.0 Firefox/89.0"
    ]
    return random.choice(user_agents)

# Function to generate a realistic Referer header
def generate_realistic_referer(keyword):
    referers = [
        "https://www.google.com/search?q=",
        "https://www.bing.com/search?q=",
        "https://www.yahoo.com/search?p=",
        "https://www.duckduckgo.com/?q=",
        "https://www.ask.com/web?q="
    ]
    return random.choice(referers) + keyword

# Function to perform a search and process links
def perform_search_and_process_links(keyword):
    search_url = f"https://www.google.com/search?q={keyword}+site:miralishahidi.ir"
    headers = {
        'User-Agent': generate_random_user_agent(),
        'X-Forwarded-For': generate_random_ipv4(),
        'Referer': generate_realistic_referer(keyword)
    }

    # Randomly select HTTP version (used only in the log message; requests itself speaks HTTP/1.1)
    http_version = random.choice(["http1.1", "http2"])

    # Send request to the search URL
    print(f"Fetching page content from: {search_url} with HTTP version: {http_version}")
    try:
        response = requests.get(search_url, headers=headers)
        response.raise_for_status()
    except requests.RequestException as e:
        print(f"Failed to fetch page content. Error: {e}")
        return

    page_content = response.text

    # Process and extract links
    soup = BeautifulSoup(page_content, 'html.parser')
    links = []
    for link in soup.find_all('a', href=True):
        href = link['href']
        if href.startswith('/url?q='):
            cleaned_link = href[7:].split('&')[0]
            links.append(cleaned_link)

    # Process the links
    for link in links:
        print(f"Extracted link: {link}")
        try:
            response = requests.get(link, headers=headers)
            response.raise_for_status()
            print(f"HTTP Status Code: {response.status_code}")
            print("Link content preview:")
            print(response.text[:200])  # Preview content of the link
        except requests.RequestException as e:
            print(f"Failed to fetch link content. Error: {e}")
        print("-----")

# Main function to run the program
def main():
    while True:
        # Send Ethernet frame
        send_ethernet_frame('ff:ff:ff:ff:ff:ff')  # Broadcast

        # Send IPv4 packet
        ipv4_dest = generate_random_ipv4()
        send_ipv4_packet(ipv4_dest)

        # Send IPv6 packet
        ipv6_dest = generate_random_ipv6()
        send_ipv6_packet(ipv6_dest)

        # Search and process links
        keywords = [
            "IT", "ICT", "OT", "IIOT", "IOT", "network", "cybersecurity", "AI", "machine+learning", "data+science",
            "cloud+computing", "blockchain", "automation", "digital+transformation", "IoT", "big+data",
            "analytics", "software+development", "IT+consulting", "networking", "virtualization",
            "system+integration", "tech+trends", "IT+strategy", "smart+devices", "enterprise+IT",
            "cyber+defense", "data+protection", "IT+infrastructure", "technology+solutions",
            "security+services", "cloud+storage", "IT+support", "tech+innovation", "software+engineering",
            "information+security", "IT+management", "digital+marketing", "IT+services", "enterprise+solutions",
            "IT+architecture", "IT+operations", "mobile+computing", "IT+project+management", "IT+training",
            "tech+consulting", "network+security", "IT+systems", "data+analytics", "IT+compliance",
            "IT+governance", "IT+trends", "IT+support+services", "IT+outsourcing", "technology+consulting"
        ]
        keyword = random.choice(keywords)
        perform_search_and_process_links(keyword)

        time.sleep(10)  # 10-second delay to avoid network overload

if __name__ == "__main__":
    main()
1. Imports and Setup:
The script imports the necessary libraries: random for generating random values, time for delays, requests for HTTP requests, BeautifulSoup from bs4 for HTML parsing, and scapy for packet manipulation.
2. MAC and IP Address Generation:
generate_random_mac() generates a random MAC address. generate_random_ipv4() and generate_random_ipv6() generate random IPv4 and IPv6 addresses, respectively.
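For a quick feel of what these helpers return, the following sketch calls them directly; it assumes the script is saved as your_script_name.py (the placeholder name used in the chmod example above), and the commented values are just illustrative since every call is random.

# Quick demonstration of the address generators (sketch); assumes the
# script is saved as your_script_name.py in the current directory.
from your_script_name import generate_random_mac, generate_random_ipv4, generate_random_ipv6

print(generate_random_mac())   # e.g. '3a:7f:01:c9:5e:b2'
print(generate_random_ipv4())  # e.g. '203.7.114.89'
print(generate_random_ipv6())  # e.g. '2f1a:9c:0:44d:ffff:12:8:beef'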
3. Packet Functions:
send_ethernet_frame(destination_mac) sends an Ethernet frame with a random source MAC address. send_ipv4_packet(destination_ip) sends an IPv4 packet with a random ToS value. send_ipv6_packet(destination_ip) sends an IPv6 packet with a random Traffic Class value.
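To exercise these helpers once without the infinite loop, a one-off test along the following lines can be used. It is only a sketch: 192.0.2.10 is a documentation (TEST-NET-1) address chosen here as a placeholder target, the module name is the placeholder from the chmod example, and sending raw packets with Scapy normally requires root privileges.

# One-off test of the packet helpers against a fixed placeholder target (sketch).
from your_script_name import send_ethernet_frame, send_ipv4_packet

send_ethernet_frame('ff:ff:ff:ff:ff:ff')  # broadcast frame, as in main()
send_ipv4_packet('192.0.2.10')            # single IPv4 ICMP echo with a random ToS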
4. HTTP Headers Generation:
generate_random_user_agent() returns a random User-Agent string. generate_realistic_referer(keyword) returns a realistic Referer header based on a keyword.
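Together with generate_random_ipv4(), these two helpers build the header set that perform_search_and_process_links() attaches to every request. A standalone sketch, again assuming the script is importable as your_script_name and using an arbitrary example keyword:

# Assembling the request headers the same way the script does (sketch).
from your_script_name import (
    generate_random_user_agent,
    generate_realistic_referer,
    generate_random_ipv4,
)

headers = {
    'User-Agent': generate_random_user_agent(),
    'X-Forwarded-For': generate_random_ipv4(),
    'Referer': generate_realistic_referer("network+security"),
}
print(headers)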
5. Web Scraping:
perform_search_and_process_links(keyword) performs a Google search, processes the results to extract links, and fetches the content of those links.
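The link-extraction step relies on Google result links starting with '/url?q='. That logic can be tested offline against a static HTML snippet, without hitting Google at all; the snippet below is a minimal, self-contained sketch of the same parsing code.

# Testing the link-extraction logic offline (sketch).
from bs4 import BeautifulSoup

sample_html = """
<a href="/url?q=https://example.com/page&sa=U">Result 1</a>
<a href="/relative/other">Not a result link</a>
"""

soup = BeautifulSoup(sample_html, 'html.parser')
links = []
for link in soup.find_all('a', href=True):
    href = link['href']
    if href.startswith('/url?q='):
        links.append(href[7:].split('&')[0])

print(links)  # ['https://example.com/page']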
6. Main Loop:
main() continuously sends Ethernet frames, IPv4 packets, and IPv6 packets, and performs web searches. It pauses for 10 seconds between iterations to avoid overwhelming the network.
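Because Scapy needs raw-socket access, the script is normally run with root privileges on Linux. For a bounded test instead of the infinite loop, a small wrapper like the following can be used; it is a sketch that assumes the placeholder module name from above and simply runs one iteration's worth of calls.

# Bounded test run (sketch): checks for root, then performs a single pass
# of the actions main() would repeat forever.
import os
from your_script_name import (
    send_ethernet_frame,
    send_ipv4_packet,
    generate_random_ipv4,
    perform_search_and_process_links,
)

if os.geteuid() != 0:
    raise SystemExit("Run as root: raw packet sending requires elevated privileges.")

send_ethernet_frame('ff:ff:ff:ff:ff:ff')
send_ipv4_packet(generate_random_ipv4())
perform_search_and_process_links("network+security")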