Alex J. Champandard
149ae4e566
Update training parameters.
9 years ago
Alex J. Champandard
de047d6a74
Fix for images that are not multiples of 2 or 4, depending on the rescaling.
9 years ago
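The fix above addresses a shape constraint: the rescaling works in factors of 2 or 4, so images whose dimensions are not multiples of that factor need to be padded first. A minimal illustration of the constraint (not the commit's actual fix), padding an image up to the next multiple with numpy:

```python
import numpy as np

def pad_to_multiple(image, factor):
    """Pad an (H, W, C) image on the bottom/right so both spatial dimensions
    are multiples of `factor` (e.g. 2 or 4 depending on the rescaling),
    repeating the border pixels rather than inserting zeros."""
    h, w = image.shape[:2]
    pad_h = (-h) % factor
    pad_w = (-w) % factor
    return np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), mode='edge')
```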
Alex J. Champandard
29bfe58d73
Merged documentation changes.
9 years ago
Alex J. Champandard
9a228c1d78
Merge pull request #44 from wieczorek1990/master
...
Fixed typo.
9 years ago
Alex J. Champandard
b24fa7490d
Merge pull request #47 from alexjc/v0.2
...
Release 0.2: Tiled Rendering, 1:1 Noise/Blur Fixing
9 years ago
Alex J. Champandard
03914db364
Fix formatting, minor tweaks to Docker build files for release.
9 years ago
Alex J. Champandard
448e7b93dc
Fix progress output when tiled rendering, changed default zoom level.
9 years ago
Alex J. Champandard
cabaaeeefe
Add training scripts for networks currently being trained, for release v0.2.
9 years ago
Alex J. Champandard
1ad40b6d71
Merge branch 'master' into v0.2
9 years ago
Alex J. Champandard
ac49676415
Add tiled rendering with padding, no feather-blending but looks good enough.
9 years ago
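The tiled rendering described in ac49676415 works roughly like this: each tile is cut out with some extra padding of surrounding context, enhanced, and the enhanced margin is cropped away before pasting, which is why no feather-blending between tiles is needed. A minimal sketch, assuming a hypothetical `enhance(tile)` function that upscales an array by `zoom`; this is not the repository's implementation:

```python
import numpy as np

def render_tiled(image, enhance, tile=80, pad=8, zoom=2):
    """Upscale `image` (H x W x C) tile by tile, with `pad` pixels of context
    around each tile so the network sees past the tile border."""
    h, w, c = image.shape
    output = np.zeros((h * zoom, w * zoom, c), dtype=image.dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            # Padded source window, clamped to the image bounds.
            y0, y1 = max(0, y - pad), min(h, y + tile + pad)
            x0, x1 = max(0, x - pad), min(w, x + tile + pad)
            out = enhance(image[y0:y1, x0:x1])
            # Crop the enhanced padding back off and paste the tile in place.
            ty, tx = (y - y0) * zoom, (x - x0) * zoom
            th = (min(h, y + tile) - y) * zoom
            tw = (min(w, x + tile) - x) * zoom
            output[y * zoom:y * zoom + th, x * zoom:x * zoom + tw] = \
                out[ty:ty + th, tx:tx + tw]
    return output
```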
Łukasz Wieczorek
ed20dcefd5
Fixed typo
9 years ago
Alex J. Champandard
095fe42dc3
Add tiled rendering, currently with no padding for each tile.
9 years ago
Alex J. Champandard
d18c08f1b5
Integrated reflection padding instead of zero padding for extra quality during training and inference.
9 years ago
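Reflection padding (d18c08f1b5) mirrors the border pixels instead of surrounding the image with zeros, so convolutions near the edge see plausible content rather than a black frame. A small numpy illustration of the difference, not the padding layer used in the project:

```python
import numpy as np

img = np.arange(16, dtype=np.float32).reshape(4, 4)

# Zero padding inserts hard black borders that the network has to learn to ignore.
zero_padded = np.pad(img, 2, mode='constant', constant_values=0)

# Reflection padding mirrors the image content at the border instead.
reflect_padded = np.pad(img, 2, mode='reflect')
```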
Alex J. Champandard
90c0b7ea43
Fix padding code, more reliable for specific upscale/downscale combinations.
9 years ago
Alex J. Champandard
3b2a6b9d8d
Add extra padding on input to avoid zero-padding. Experiment with training values from ENet (segmentation).
9 years ago
Alex J. Champandard
7924cc4a85
Improve display and filenames for saving output.
9 years ago
Alex J. Champandard
93e5a41d9a
Fix and optimize pre-processing of images.
9 years ago
Alex J. Champandard
11ba505252
Fix for gradient clipping code.
9 years ago
Alex J. Champandard
cf65207a2e
Use full range of tanh output rather than [-0.5, +0.5], avoids clipping.
9 years ago
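The tanh change in cf65207a2e maps the activation's full [-1, +1] range onto the pixel range instead of only half of it, so the network can reach pure black and pure white without the output being clipped. A sketch of that mapping with illustrative scaling constants, not the values from the code:

```python
import numpy as np

def to_pixels(tanh_out):
    """Map a tanh activation in [-1, +1] onto pixel values in [0, 255].

    Using the activation's full range lets the output cover the whole
    pixel range, so clipping only trims numerical overshoot.
    """
    return np.clip((tanh_out + 1.0) * 127.5, 0.0, 255.0).astype(np.uint8)
```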
Alex J. Champandard
0c31e53731
Fix suggested alias for relative paths. Closes #37, #28.
9 years ago
Alex J. Champandard
34f8e629c2
Improve the alias used to invoke docker, so it's more robust to directory locations and input paths.
9 years ago
Alex J. Champandard
c610623b11
Add gradient clipping, helpful for preventing problems with extreme parameters/architectures.
9 years ago
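Gradient clipping (c610623b11) rescales the gradients when their overall norm grows too large, so one extreme batch or an aggressive architecture/learning-rate choice cannot blow up the parameters in a single update. A minimal numpy sketch of norm-based clipping; the project would apply this inside its Theano update step, and `max_norm` here is illustrative:

```python
import numpy as np

def clip_gradients(grads, max_norm=5.0):
    """Rescale a list of gradient arrays so their joint L2 norm is at most max_norm."""
    total_norm = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-7)
        grads = [g * scale for g in grads]
    return grads
```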
Alex J. Champandard
02d2fca6c5
Corrected value for adversarial loss. Don't refactor math the day after stopping coffee.
9 years ago
Alex J. Champandard
f2494f8078
Add new downscale layers, separate from upscale steps. Renamed --scales to --zoom for inference.
9 years ago
Alex J. Champandard
0c9937a317
Merge pull request #22 from msfeldstein/master
...
Remove cnmem theano flag.
9 years ago
Alex J. Champandard
064f9dd589
Add three image pre-processing options, improve loading code.
9 years ago
Michael Feldstein
fef84c5b44
Remove cnmem theano flag since it doesn't work if you're sharing the GPU with a display.
9 years ago
Alex J. Champandard
5ef872b876
Add warning for files that may be too large for 4x.
9 years ago
Alex J. Champandard
a5ad2c25e6
Merge pull request #18 from alexjc/training
...
Training Improvements
9 years ago
Alex J. Champandard
17fcad8d28
Refactor of changes related to training.
9 years ago
Alex J. Champandard
2b67daedb6
Merge pull request #12 from dribnet/generic_seeds
...
Move generation of seeds out of training network.
9 years ago
Alex J. Champandard
cad5eff572
Merge pull request #11 from dribnet/save_every_epoch
...
Added --save-every-epoch option
9 years ago
Alex J. Champandard
a9b0cd9887
Merge pull request #9 from zuphilip/patch-1
...
Fix some typos in README.
9 years ago
Alex J. Champandard
fcc5e87858
Merge pull request #4 from dribnet/valid_dir
...
Add valid dir when necessary.
9 years ago
Alex J. Champandard
1478977f18
Merge pull request #10 from OndraM/fix-duplicate-param
...
Fix duplicate param definition.
9 years ago
Tom White
8f5167d235
Fix enhancer.process to pass img, seed
9 years ago
Tom White
37cb208374
Move generation of seeds out of training network
...
This moves the generation of the image seeds out of the
training network and into the DataLoader. Currently seeds
are computed as bilinear downsamplings of the original image.
This is almost functionally equivalent to the version
it replaces, but opens up new possibilities at training
time because the seeds are now decoupled from the network.
For example, seeds could be made with different interpolations
or even with other transformations such as image compression.
9 years ago
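The seed generation described in the commit above amounts to something like the following, assuming Pillow for the bilinear resize; this is a sketch of the idea, not the DataLoader code:

```python
from PIL import Image

def make_seed(original, zoom=2):
    """Build a low-resolution 'seed' from the original image by bilinear
    downsampling, outside the training network.

    Because the seed is produced here rather than inside the network, it
    could just as easily use another interpolation, or a round-trip through
    JPEG compression, without touching the model itself.
    """
    w, h = original.size
    return original.resize((w // zoom, h // zoom), Image.BILINEAR)
```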
Tom White
b05ee6ad08
Added --save-every-epoch option
9 years ago
Tom White
c5053806bd
Add valid dir when necessary
9 years ago
Ondřej Machulda
eb25e737cf
Fix duplicate param definition
9 years ago
Philipp Zumstein
f83e69e96a
Fix some typos in README
9 years ago
Alex J. Champandard
f68f04fb1c
Improve instructions to train custom models so a new file is output and the existing one is not loaded. Use the --model parameter!
9 years ago
Alex J. Champandard
203917d122
Switch default to small model to reduce memory usage.
9 years ago
Alex J. Champandard
2b5fc8f51d
Add docker instructions, fix for slow compute in CPU image.
9 years ago
Alex J. Champandard
74cb95609e
Fix for docker build using latest Miniconda and Python 3.5 explicitly.
9 years ago
Alex J. Champandard
99c767b7e2
Add docker configuration files for CPU and GPU.
9 years ago
Alex J. Champandard
bf22450b8d
Update documentation for new --train usage, minor improvements.
9 years ago
Alex J. Champandard
4c55c48f62
Add argument for specifying training images, cleaned up file handling.
9 years ago
Alex J. Champandard
1c38f2ca31
New meme-friendly image and link to demo.
9 years ago
Alex J. Champandard
b1c054ce9f
Improve the README for applying and training models.
9 years ago