bs-constant now only in one place and updated README

parent 4f250f61c3
commit cd89945d55
@@ -6,4 +6,5 @@ I made it because I want to try to break it.
 This will work iff I succeed in building a PPT-discriminator for sha256 from randomness
 As my first approach this discriminator will be based on an LSTM-network.
 Update: This worked out way better than expected; given long enough sequences (128 Bytes are more than enough) we can discriminate successfully in 100% of cases.
-Update: I did an upsie in the training-code and the discriminator is actually shit.
+Update 2: I did an upsie in the training-code and the discriminator is actually shit.
+Update 3: Turns out: sha256 produces fairly high quality randomness and this project seems to have failed...
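The scheme the README describes (sha256+xor, attacked for IND-CPA security) is not itself shown in this hunk. A minimal sketch of what a sha256+xor stream construction of this kind could look like, with all names and the counter-based keystream being assumptions rather than the repository's actual code:

```python
import hashlib

BS = 256 // 8  # sha256 digest length in bytes (32)

def keystream_block(key: bytes, counter: int) -> bytes:
    # Derive one 32-byte keystream block by hashing key || counter.
    return hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each 32-byte chunk of the plaintext with a fresh keystream block.
    out = bytearray()
    for i in range(0, len(plaintext), BS):
        block = keystream_block(key, i // BS)
        chunk = plaintext[i:i + BS]
        out.extend(b ^ k for b, k in zip(chunk, block))
    return bytes(out)

# XOR is its own inverse, so decryption is the same operation.
msg = b"attack at dawn"
ct = encrypt(b"secret key", msg)
assert encrypt(b"secret key", ct) == msg
```

Note that as sketched (no nonce), encrypting the same message under the same key twice yields identical ciphertexts, which already breaks IND-CPA; the README's project targets the deeper question of whether sha256 output is PPT-distinguishable from true randomness.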
@@ -10,7 +10,7 @@ import random
 import shark
 from model import Model
 
-bs = int(256/8)
+bs = shark.bs
 
 class Model(nn.Module):
     def __init__(self):
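The change above is the point of the commit: the block-size constant is now defined once in `shark` and imported everywhere else, instead of being recomputed as `int(256/8)` in each file. The value itself is just the sha256 digest length in bytes:

```python
import hashlib

bs = int(256 / 8)  # 32: sha256 produces 256 bits = 32 bytes

# The constant matches hashlib's reported digest size for sha256.
assert bs == hashlib.sha256(b"").digest_size == 32
```

Keeping one definition means a future change of hash function (and hence digest size) only needs to touch `shark.py`.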
model.py

@@ -3,6 +3,8 @@ from torch import nn
 from torch import nn, optim
 from torch.utils.data import DataLoader
 
+import shark
+
 class Model(nn.Module):
     def __init__(self):
         super(Model, self).__init__()
shark.py

@@ -3,12 +3,6 @@ import math
 import os
 import random
 
-# Shark is a sha256+xor based encryption.
-# I made it because I want to try to break it.
-# (Precisely: Show it does not provide semantic security, because it is not IND-CPA-secure)
-# This will work iff I succeed in building a PPT-discriminator for sha256 from randomness
-# As my first approach this discriminator will be based on an LSTM-network.
-
 bs = int(256/8)
 
 def xor(ta,tb):
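The hunk ends at the signature of the `xor` helper; its body is not shown in this diff. A plausible bytewise implementation is sketched below as an assumption (the real `shark.xor` may well operate on tensors rather than `bytes`, given the PyTorch code elsewhere in the repository):

```python
def xor(ta, tb):
    # Bytewise XOR of two equal-length byte sequences.
    assert len(ta) == len(tb), "xor requires equal-length inputs"
    return bytes(a ^ b for a, b in zip(ta, tb))

# XOR-ing a value with itself yields all zero bytes.
assert xor(b"\x0f\xf0", b"\x0f\xf0") == b"\x00\x00"
```

This is the core of the encryption step: ciphertext = plaintext XOR keystream, applied block by block with `bs`-sized sha256 outputs as the keystream.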