Server and client overhauled
Server-Side Changes
1. Hash Check and Caching:
   - Before Processing: The server now checks if the `received_hash` already exists in the `processed_requests` dictionary. 
   - Cache Hit: If the hash exists, it means the request has been processed before, and the server returns the cached result immediately, avoiding redundant calculations.
   - Cache Miss: If the hash does not exist, the server processes the data, stores the result in the `processed_requests` cache, and then returns the result.

2. Hash Verification:
   - The server calculates a hash of the incoming cities data to ensure data integrity and authenticity. If the calculated hash matches the `received_hash`, it proceeds; otherwise, it returns a hash verification error.

3. Cache Management:
   - To prevent the cache from growing indefinitely, the server checks the cache size and removes the oldest entry whenever it exceeds the predefined `CACHE_SIZE_LIMIT` (a minimal sketch of this server-side flow appears below).
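
A minimal sketch of this server-side flow, for illustration only: the handler name, the `compute_result` callable, and the `CACHE_SIZE_LIMIT` value here are assumptions, not the actual server.py code.

import json

CACHE_SIZE_LIMIT = 100       # assumed limit; the real value is defined in server.py
processed_requests = {}      # received_hash -> cached result

def handle_request(raw_bytes, compute_result, custom_hash):
    payload = json.loads(raw_bytes.decode('utf-8'))
    cities, received_hash = payload['data'], payload['hash']

    # Cache hit: return the stored result without recomputing.
    if received_hash in processed_requests:
        return processed_requests[received_hash]

    # Hash verification: recompute the hash over the cities data and compare.
    if custom_hash(json.dumps(cities)) != received_hash:
        return {'error': 'Hash verification failed'}

    # Cache miss: process the data, store the result, and evict the oldest entry if over the limit.
    result = compute_result(cities)
    processed_requests[received_hash] = result
    if len(processed_requests) > CACHE_SIZE_LIMIT:
        processed_requests.pop(next(iter(processed_requests)))
    return result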

Client-Side Changes
1. Unified Handling:
   - The client function now attempts TCP and then UDP in sequence, handling responses from both protocols in one unified function.
   - A global flag, `first_response`, ensures that only the first received response is processed and printed, avoiding duplicate log entries.

2. Error Handling:
   - The client captures and logs any errors encountered during the request, ensuring transparency and better debugging capabilities.

3. Single Console Log:
   - The function logs only one response, regardless of which protocol (TCP or UDP) responds first, keeping the console output clean and free of duplicate entries.
BadNintendo authored Nov 14, 2024
1 parent 94e538a commit dcc7ac5
Showing 2 changed files with 499 additions and 0 deletions.
228 changes: 228 additions & 0 deletions SingleThreadAdv/client.py
@@ -0,0 +1,228 @@
import socket
import json
import time
import networkx as nx

class QPRx2025:
    def __init__(self, seed=0):
        self.seed = seed % 1000000
        self.entropy = self.mix_entropy(int(time.time() * 1000))
        self.CHARACTERS = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'
        self.LCG_PARAMS = {
            'a': 1664525,
            'c': 1013904223,
            'm': 4294967296
        }

    def mix_entropy(self, value):
        # XOR-fold shifted copies of the value (the leading and trailing `value` terms cancel, leaving only the shifted terms).
        return value ^ (value >> 32) ^ (value >> 16) ^ (value >> 8) ^ value

    def lcg(self, a=None, c=None, m=None):
        # Linear congruential generator, perturbed by the running entropy value.
        if a is None: a = self.LCG_PARAMS['a']
        if c is None: c = self.LCG_PARAMS['c']
        if m is None: m = self.LCG_PARAMS['m']
        self.seed = (a * self.seed + c + self.entropy) % m
        self.entropy = self.mix_entropy(self.seed + int(time.time() * 1000))
        return self.seed

    def mersenne_twister(self):
        # Minimal Mersenne Twister: the state is re-seeded from self.seed on every call
        # and a single tempered value is returned.
        MT = [0] * 624
        index = 0

        def initialize(seed):
            MT[0] = seed
            for i in range(1, 624):
                MT[i] = (0x6c078965 * (MT[i - 1] ^ (MT[i - 1] >> 30)) + i) & 0xffffffff

        def generate_numbers():
            for i in range(624):
                y = (MT[i] & 0x80000000) + (MT[(i + 1) % 624] & 0x7fffffff)
                MT[i] = MT[(i + 397) % 624] ^ (y >> 1)
                if y % 2 != 0:
                    MT[i] ^= 0x9908b0df

        def extract_number():
            nonlocal index
            if index == 0:
                generate_numbers()
            y = MT[index]
            y ^= y >> 11
            y ^= (y << 7) & 0x9d2c5680
            y ^= (y << 15) & 0xefc60000
            y ^= y >> 18
            index = (index + 1) % 624
            return y

        initialize(self.seed)
        return extract_number()

    def quantum_polls_relay(self, max_val):
        # Combine the LCG and Mersenne Twister outputs to pick a value in [0, max_val).
        if not isinstance(max_val, int) or max_val <= 0:
            raise ValueError('Invalid max value for QuantumPollsRelay')
        lcg_value = self.lcg()
        mt_value = self.mersenne_twister()
        return ((lcg_value + mt_value) % 1000000) % max_val

    def generate_characters(self, length):
        if not isinstance(length, int) or length <= 0:
            raise ValueError('Invalid length for generateCharacters')
        return ''.join(self.CHARACTERS[self.quantum_polls_relay(len(self.CHARACTERS))] for _ in range(length))

    def the_options(self, options):
        if not isinstance(options, list) or len(options) == 0:
            raise ValueError('No options provided')
        return options[self.quantum_polls_relay(len(options))]

    def the_rewarded(self, participants):
        if not isinstance(participants, list) or len(participants) == 0:
            raise ValueError('No participants provided')
        return participants[self.quantum_polls_relay(len(participants))]

    def generate_uuid(self):
        # Parentheses ensure the & 0xff mask applies to the whole sum, keeping each value in byte range.
        bytes_array = [(self.mersenne_twister() + self.quantum_polls_relay(256)) & 0xff for _ in range(16)]
        bytes_array[6] = (bytes_array[6] & 0x0f) | 0x40  # set UUID version 4 bits
        bytes_array[8] = (bytes_array[8] & 0x3f) | 0x80  # set RFC 4122 variant bits
        uuid = ''.join(f'{b:02x}' for b in bytes_array)
        return f'{uuid[:8]}-{uuid[8:12]}-{uuid[12:16]}-{uuid[16:20]}-{uuid[20:]}'

    def custom_hash(self, input, salt='', hash_val=False):
        # FNV-1a style 32-bit hash; pass an existing hash string via hash_val to verify instead of hash.
        def hashing(input, salt):
            combined = f'{input}{salt}'
            hashed = 0x811c9dc5
            for char in combined:
                hashed ^= ord(char)
                hashed = (hashed * 0x01000193) & 0xffffffff
            return f'{hashed:08x}'

        def verify_hash(input, salt, hashed):
            return hashing(input, salt) == hashed

        if isinstance(hash_val, str):
            return verify_hash(input, salt, hash_val)
        return hashing(input, salt)

    def xor_cipher(self, input, key):
        # Repeating-key XOR; applying it twice with the same key restores the input.
        return ''.join(chr(ord(input[i]) ^ ord(key[i % len(key)])) for i in range(len(input)))

# Initialize QPRx2025
qprx = QPRx2025(seed=12345)

# Define the cities data
cities = [
{'name': 'City0', 'x': 0, 'y': 0},
{'name': 'City1', 'x': 10, 'y': 10},
{'name': 'City2', 'x': 20, 'y': 20},
{'name': 'City3', 'x': 30, 'y': 5},
{'name': 'City4', 'x': 40, 'y': 15},
{'name': 'City5', 'x': 50, 'y': 0},
{'name': 'City6', 'x': 60, 'y': 10},
{'name': 'City7', 'x': 70, 'y': 20},
{'name': 'City8', 'x': 80, 'y': 5},
{'name': 'City9', 'x': 90, 'y': 15},
{'name': 'City10', 'x': 100, 'y': 0},
{'name': 'City11', 'x': 110, 'y': 10},
{'name': 'City12', 'x': 120, 'y': 20},
{'name': 'City13', 'x': 130, 'y': 5},
{'name': 'City14', 'x': 140, 'y': 15},
{'name': 'City15', 'x': 150, 'y': 0},
{'name': 'City16', 'x': 160, 'y': 10},
{'name': 'City17', 'x': 170, 'y': 20},
{'name': 'City18', 'x': 180, 'y': 5},
{'name': 'City19', 'x': 190, 'y': 15},
{'name': 'City20', 'x': 200, 'y': 0},
{'name': 'City21', 'x': 210, 'y': 10},
{'name': 'City22', 'x': 220, 'y': 20},
{'name': 'City23', 'x': 230, 'y': 5},
{'name': 'City24', 'x': 240, 'y': 15},
{'name': 'City25', 'x': 250, 'y': 0},
{'name': 'City26', 'x': 260, 'y': 10},
{'name': 'City27', 'x': 270, 'y': 20},
{'name': 'City28', 'x': 280, 'y': 5},
{'name': 'City29', 'x': 290, 'y': 15},
{'name': 'City30', 'x': 300, 'y': 0},
{'name': 'City31', 'x': 310, 'y': 10},
{'name': 'City32', 'x': 320, 'y': 20},
{'name': 'City33', 'x': 330, 'y': 5},
{'name': 'City34', 'x': 340, 'y': 15},
{'name': 'City35', 'x': 350, 'y': 0},
{'name': 'City36', 'x': 360, 'y': 10},
{'name': 'City37', 'x': 370, 'y': 20},
{'name': 'City38', 'x': 380, 'y': 5},
{'name': 'City39', 'x': 390, 'y': 15},
{'name': 'City40', 'x': 400, 'y': 0},
{'name': 'City41', 'x': 410, 'y': 10},
{'name': 'City42', 'x': 420, 'y': 20},
{'name': 'City43', 'x': 430, 'y': 5},
{'name': 'City44', 'x': 440, 'y': 15},
{'name': 'City45', 'x': 450, 'y': 0},
{'name': 'City46', 'x': 460, 'y': 10},
{'name': 'City47', 'x': 470, 'y': 20},
{'name': 'City48', 'x': 480, 'y': 5},
{'name': 'City49', 'x': 490, 'y': 15}
]

# JSON encode the cities data and calculate the hash
cities_data = json.dumps(cities).encode('utf-8')
hash_value = qprx.custom_hash(cities_data.decode('utf-8'))
request_data = json.dumps({'data': cities, 'hash': hash_value}).encode('utf-8')

# Flag to indicate which protocol received the response first
first_response = None

def client():
    global first_response  # use the module-level flag so only the first response is recorded
    protocols = ['TCP', 'UDP']
    results = {}

    for protocol in protocols:
        if first_response:
            break

        try:
            # TCP needs a connected stream socket; UDP only needs a destination address.
            if protocol == 'TCP':
                sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                sock.connect(('127.0.0.1', 3000))
            else:
                sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                server_address = ('127.0.0.1', 3000)

            start_time = time.time()

            if protocol == 'TCP':
                sock.sendall(request_data)
                data = sock.recv(4096)
            else:
                sock.sendto(request_data, server_address)
                data, _ = sock.recvfrom(4096)

            end_time = time.time()
            processing_time = round((end_time - start_time) * 1000, 2)

            response = json.loads(data.decode('utf-8'))

            # Record only the first response so a single result is printed.
            if not first_response:
                first_response = protocol
                results['protocol'] = protocol
                results['response_time'] = processing_time
                results['response'] = response

        except Exception as e:
            # Errors are captured and reported the same way as successful responses.
            if not first_response:
                first_response = protocol
                results['protocol'] = protocol
                results['error'] = str(e)
        finally:
            if 'sock' in locals():
                sock.close()

    if 'response' in results:
        print(f"{results['protocol']} Response Time (ms): {results['response_time']}")
        print("Optimized Path:", results['response']['optimized_path'])
        print("Optimized Distance:", results['response']['optimized_distance'])
        print("Optimized Array:", results['response']['optimized_array'])
    else:
        print(f"{results['protocol']} error: {results['error']}")

# Execute the client function
print("Sending data using TCP and UDP...")
client()

1 comment on commit dcc7ac5

@BadNintendo (Owner, Author)

The primary time-consuming aspect of this implementation is the hashing process. Hashing is integral to future data processing and serves as an example of how a server communicates with another server or a client in a verifiable manner. The implementation leverages hashing to ensure data integrity and authenticity, making it difficult for unauthorized third parties to tamper with the data in transit without the modification being detected.

The key benefits of this method include:

  1. Efficiency through Caching: The server caches the results of previously processed requests using their hash values as identifiers. This ensures that if the same request is made again, the server can quickly return the cached result without recalculating, significantly improving efficiency.

  2. Data Integrity and Security: By calculating and verifying hashes, the server ensures the authenticity and integrity of the data. This added layer of security helps prevent tampering and ensures that the data received is exactly what was sent, without any unauthorized modifications.

  3. Protocol Agnosticism: The use of hashing and caching makes the process independent of the communication protocol (TCP or UDP), allowing for flexibility in how the client and server interact. This approach demonstrates a robust method for future-proof communication between servers and clients.

  4. Performance Optimization: The caching mechanism not only speeds up repeated requests but also manages memory usage effectively by clearing the oldest entries when the cache size limit is exceeded. This balance between performance and resource management ensures the system remains responsive and efficient over time.

In summary, this method showcases a secure, efficient, and protocol-agnostic way for servers and clients to communicate, leveraging hashing to enhance integrity checks and performance. This approach is forward-thinking, making it harder for unauthorized entities to pass off tampered data without the appropriate hashes.
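
As a small illustration of the hash round trip described above, using the `custom_hash` helper from client.py (this assumes the server applies the same FNV-style hash; the variable names are only an example):

qprx = QPRx2025(seed=12345)
payload = json.dumps(cities)                               # same serialization the client sends
sent_hash = qprx.custom_hash(payload)                      # client side: hash the serialized data
is_valid = qprx.custom_hash(payload, hash_val=sent_hash)   # verification path returns True/False
print(is_valid)  # True when the data was not modified in transit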
