Commit Graph

95 Commits

SHA1 Message Date
199fab7875 New shell-command, nltk for keyword extraction from description 2021-12-11 13:58:01 +01:00
36baf1aaec Bug: Typo in train 2021-12-11 11:54:25 +01:00
81fa6ca4d6 Faster training (earlier stopping) 2021-12-11 11:52:49 +01:00
1f5dea6aff Added a requirements.txt 2021-12-08 11:54:27 +01:00
16cc68dfed chmod +x (again) 2021-12-05 19:57:05 +01:00
da9569fd4c Added 'competence' command (displays recommenders and justifications of their scores) 2021-12-05 19:56:26 +01:00
aa95a9b16b chmod +x 2021-12-05 19:54:03 +01:00
39930d1233 kp 2021-12-05 19:53:30 +01:00
f2fad859dc Added a progress-command (and newly trained weights) 2021-11-24 22:35:39 +01:00
ef7fceacea Better 'Gradientenabstieg' (gradient descent) 2021-11-23 20:51:24 +01:00
5f7366e556 Newly trained net 2021-11-07 11:41:01 +01:00
7f51f139f2 Actually, we will also train on whitepapers (because otherwise I would have to fix an ugly bug) 2021-10-17 15:52:52 +02:00
ce99e5301b Smashed bug with argument-parsing 2021-10-17 15:51:26 +02:00
8778cfdae6 Also don't train on whitepapers 2021-10-17 15:50:33 +02:00
3588587c92 Earlier removal of whitepapers (don't extrapolate from them) 2021-10-17 15:48:44 +02:00
65e8948202 Filter out the whitepapers I have in my library 2021-10-17 15:47:37 +02:00
153aa434d5 Implemented listScores command 2021-10-13 15:10:12 +02:00
23cc62ac01 Newly trained net 2021-10-12 20:18:24 +02:00
6689c3bf6b Removed ~ from score and shorter training 2021-10-12 20:10:48 +02:00
f27d89e43d Added a dark-mode 2021-10-05 18:25:27 +02:00
5fc161fcc0 Newly trained net 2021-10-05 18:08:55 +02:00
7fd1f4fa3f Better shit-detection and mitigation when training 2021-10-05 18:08:32 +02:00
f5763fe7da Newly trained net 2021-10-05 17:10:12 +02:00
73364935f0 Bug Fix: A function name was overloaded 2021-10-04 12:27:12 +02:00
4c4821c7a3 Newly trained net 2021-09-30 00:09:11 +02:00
ca7b2fd87c Faster termination when training 2021-09-30 00:08:53 +02:00
1bd0597291 Newly trained net 2021-09-27 00:22:45 +02:00
8c4e35bf41 Bug fix in training 2021-09-27 00:22:35 +02:00
b771204b1b Newly trained net 2021-09-26 23:18:32 +02:00
8964fa2c6a No early stopping if still bad 2021-09-26 23:18:21 +02:00
1303f302d3 Don't train if we are already perfect 2021-09-26 23:17:03 +02:00
9f9d2390b3 Less verbose training 2021-09-26 23:15:50 +02:00
ade61980b4 Newly trained net 2021-09-26 23:14:25 +02:00
98aaec1e22 Train using gradient 2021-09-26 23:13:43 +02:00
06a0461e93 Better pruning of tags and recommenders 2021-09-26 16:51:17 +02:00
9d03b37e16 New trained net 2021-09-26 15:53:41 +02:00
f3240147d5 Training now terminates earlier when stagnating (can be disabled via flag) 2021-09-26 15:52:54 +02:00
787404c134 Newly trained net 2021-09-26 14:31:24 +02:00
9126bbcc14 Stronger linear separation 2021-09-26 14:31:00 +02:00
147a78302f New trained net 2021-09-26 12:46:58 +02:00
f1c887275c removeUselessSeries and stronger linear separation 2021-09-26 12:46:30 +02:00
2bb2e15b73 Tweaks to recommendation-display 2021-09-25 20:15:14 +02:00
31cc8ce31c New trained net 2021-09-25 00:54:59 +02:00
92d1b33ee3 Made recommendation-graph way better; tweaks to bounds-loss 2021-09-25 00:54:09 +02:00
32bac42c83 Newly trained net 2021-09-24 23:40:55 +02:00
6ed1d41e2c Added new tag-based version of recommendation-visualizer and added a boundary-loss 2021-09-24 23:39:55 +02:00
5a8d76bc3d Even better trained net 2021-09-24 19:25:54 +02:00
12fd3dffc9 New trained net 2021-09-24 19:21:31 +02:00
212a30298a Stronger regression-loss; more parameter-freedom 2021-09-24 19:16:29 +02:00
22ca039502 New trained net 2021-09-24 19:14:21 +02:00