- Fixed JAX/PyTorch dtype mismatch so training completes successfully
- Added experiment plan with paper-accurate hyperparameters
- Created batch submission and monitoring scripts
- Cleaned up log files and updated `.gitignore`
- Ready for systematic paper replication
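The dtype mismatch mentioned above typically arises when data crosses framework boundaries: NumPy defaults to `float64`, while PyTorch and JAX default to `float32`. A minimal sketch of the explicit cast that avoids this (the shapes and array names are illustrative, not from the repository):

```python
import numpy as np

# NumPy produces float64 by default; most DL frameworks (PyTorch, JAX)
# expect float32. An explicit cast at the boundary prevents silent
# dtype promotion or hard errors inside the training loop.
a = np.random.randn(4, 3)        # dtype: float64
b = a.astype(np.float32)         # cast once, before handing to JAX/PyTorch

assert a.dtype == np.float64
assert b.dtype == np.float32
```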
- Update SLURM scripts to use correct CUDA modules (devel/cuda/12.4, intel compiler)
- Add JAX downgrade to 0.4.35 for CuDNN 9.5.1 compatibility
- Fix `JAX_PLATFORMS` environment variable (use `cuda`, not `gpu,cpu`)
- Update README with cluster-specific JAX installation steps
- Tested successfully: both PyTorch and JAX run on the GPU through a full training run
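The cluster-setup changes above can be sketched as a single SLURM job script. This is an illustrative fragment, not the repository's actual script: the partition name, account, and `train.py` entry point are placeholders, and module names depend on the cluster image; only the module versions, the JAX pin, and the `JAX_PLATFORMS` value come from the commits.

```shell
#!/bin/bash
#SBATCH --partition=accelerated   # placeholder partition name
#SBATCH --gres=gpu:1
#SBATCH --time=01:00:00

# CUDA toolchain modules from the commit above (HoReKa-style names)
module load devel/cuda/12.4
module load compiler/intel

# Pin JAX to 0.4.35 for CuDNN 9.5.1 compatibility
pip install "jax[cuda12]==0.4.35"

# Restrict JAX to the CUDA backend ("cuda", not "gpu,cpu")
export JAX_PLATFORMS=cuda

python train.py   # placeholder entry point
```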
- Add complete HoReKa installation guide without conda dependency
- Include SLURM job script with GPU configuration and account setup
- Add helper scripts for job submission and environment testing
- Integrate wandb logging with both online and offline modes
- Support MuJoCo Playground environments for humanoid control
- Update README with clear separation of added vs original content
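The online/offline wandb toggle is useful because compute nodes often lack internet access; offline runs can be uploaded later with `wandb sync`. A minimal sketch of one way to wire the toggle via the `WANDB_MODE` environment variable (the helper name `wandb_mode` is hypothetical; `wandb.init(mode=...)` accepts the same values directly):

```python
import os

def wandb_mode(online: bool) -> str:
    # wandb honors WANDB_MODE; "offline" buffers logs locally for a
    # later `wandb sync`, "online" streams them immediately.
    mode = "online" if online else "offline"
    os.environ["WANDB_MODE"] = mode
    return mode

assert wandb_mode(False) == "offline"
```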