16 Commits (87304c93a6d373d3e3474f50967deb8a4ae39fa5)

All commits are by Alex J. Champandard, authored 9 years ago (newest first):

87304c93a6  Use traditional learning-rate decay rather than fast restarts; works better when training a continuously adapting GAN.
3809e9b02a  Add loading of images into a buffer, using multiple fragments per JPG loaded. Works well with larger datasets like OpenImages; fully GPU bound.
619fad7f3c  Add loading of parameters from saved models. Clean up learning-rate code.
f978f058dd  Add descriptions for command-line arguments.
d8c9292b80  Tweak code for generalized super-resolution.
9558a11397  Add command-line option for super-resolving images without training.
03a6813e95  Enable adversarial loss during training.
6542f435b4  Stabilize the training; tune the architecture.
b52530cf20  Add support for PReLU (default) and optional load/save.
4d82c7f157  Improve console output.
8770113f9a  Add support for loading/saving models. Fix regressions.
58e1bed4a6  Improve display and code structure.
749b467f94  Switch to sub-pixel deconvolution layer.
9f13695050  Add discriminator network.
3f24714039  Add residual blocks to generator.
058e3d3b9e  Add simple but working prototype with perceptual loss.
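Commit 749b467f94 switches the generator to a sub-pixel deconvolution layer. As a hedged aside, the core rearrangement behind that layer (often called pixel shuffle, or depth-to-space) can be sketched in NumPy as follows; the function name, shapes, and scale factor here are illustrative, not taken from the repository's code:

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r*r, H, W) feature map into (C, H*r, W*r).

    This is the depth-to-space step behind sub-pixel upsampling:
    each r x r block of output pixels is filled from r*r input channels,
    so upscaling is learned in channel space instead of via transposed
    convolution.
    """
    c_r2, h, w = x.shape
    assert c_r2 % (r * r) == 0, "channel count must be divisible by r*r"
    c = c_r2 // (r * r)
    # Split channels into (C, r, r), then interleave with the spatial dims.
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)      # -> (C, H, r, W, r)
    return x.reshape(c, h * r, w * r)   # -> (C, H*r, W*r)

# Example: collapse a 4-channel 2x2 map into one 4x4 channel (r = 2).
x = np.arange(16, dtype=np.float32).reshape(4, 2, 2)
y = pixel_shuffle(x, 2)
print(y.shape)  # (1, 4, 4)
```

A convolution producing C*r*r channels followed by this rearrangement gives the same upscaling factor as a stride-r transposed convolution, but avoids the checkerboard artifacts that transposed convolutions are prone to.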