@[TOC]
Original paper
Original paper: Texture Synthesis Using Convolutional Neural Networks
Chinese translation: https://blog.csdn.net/cicibabe/article/details/70991588
Main ideas
Implementation
"Texture Synthesis Using Convolutional Neural Networks" - Tensorflow implementation
In summary, we will try to generate texture based on the sample texture image from the scratch random noisy image.
Step 1: Preprocessing the input image
Step 2: Computing the outputs of all VGG layers for the input image.
Step 3: Defining the loss function for this problem and computing it (see the sketch after this list).
Step 4: Running the TensorFlow model to minimize the loss by optimizing the input noise variable.
Step 5: Post-processing and displaying the image.
Step 6: Automating the runs.
Step 7: Plotting the successful results.
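The key ingredient is the loss in step 3: the texture is described by the Gram matrices of VGG16 feature maps, and the loss is the summed squared difference between the Gram matrices of the sample texture and of the image being synthesized. Below is a minimal sketch of that loss in TensorFlow 1.x; the normalization and equal layer weighting follow Gatys et al. and are assumptions here, so the repository's exact scaling may differ.

```python
import tensorflow as tf

def gram_matrix(features):
    # features: [1, H, W, C] feature map from one VGG16 layer (static shape known)
    _, h, w, c = features.get_shape().as_list()
    flat = tf.reshape(features, [h * w, c])         # [H*W, C]
    return tf.matmul(flat, flat, transpose_a=True)  # [C, C] channel correlations

def texture_loss(sample_layers, noise_layers):
    # sample_layers / noise_layers: feature maps taken at the same VGG layers
    # for the texture sample and for the image being synthesized.
    loss = 0.0
    for f_s, f_n in zip(sample_layers, noise_layers):
        _, h, w, c = f_s.get_shape().as_list()
        g_s = gram_matrix(f_s)
        g_n = gram_matrix(f_n)
        # Normalization as in Gatys et al.; the repository's weighting may differ.
        loss += tf.reduce_sum(tf.square(g_n - g_s)) / (4.0 * (h * w) ** 2 * c ** 2)
    return loss
```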
Step 4: Running the TensorFlow model to minimize the loss by optimizing the input noise variable.
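The script first loads the pretrained vgg16.npy weights and builds the VGG graph, which produces the first few lines of the console output below; the only trainable variable is the noise image itself. Here is a minimal sketch of the optimization loop (TensorFlow 1.x); the stand-in loss, the Adam optimizer and its learning rate are assumptions for illustration, while in the actual script the loss is the Gram-matrix texture loss computed from the VGG16 feature maps of `input_noise`.

```python
import tensorflow as tf

# The only trainable variable: the image being synthesized.
input_noise = tf.Variable(
    tf.random_uniform([1, 256, 256, 3], 0.0, 255.0), name='input_noise')

# Stand-in loss for illustration only; the real script uses the
# Gram-matrix texture loss built from VGG16 feature maps of input_noise.
loss = tf.reduce_sum(tf.square(input_noise - 128.0))

# Optimizer choice and learning rate are assumptions.
train_op = tf.train.AdamOptimizer(learning_rate=1.0).minimize(
    loss, var_list=[input_noise])

epochs = 10000
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(1, epochs + 1):
        _, current_loss = sess.run([train_op, loss])
        if epoch % 100 == 0:
            print('Epoch: {}/{} Loss: {}'.format(epoch, epochs, current_loss))
```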
C:\Users\lenovo\Jupyter Notbook\Texture-Synthesis-Using-Convolutional-Neural-Networks-master\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
Epoch: 100/10000 Loss: 5529756400000000.0
Epoch: 200/10000 Loss: 1722265000000000.0
Epoch: 300/10000 Loss: 1083315040000000.0
Epoch: 400/10000 Loss: 790199800000000.0
Epoch: 500/10000 Loss: 570981500000000.0
Epoch: 600/10000 Loss: 410827380000000.0
Epoch: 700/10000 Loss: 297603500000000.0
Epoch: 800/10000 Loss: 220161130000000.0
Epoch: 900/10000 Loss: 168884270000000.0
Epoch: 1000/10000 Loss: 135689240000000.0
Epoch: 1100/10000 Loss: 114025070000000.0
Epoch: 1200/10000 Loss: 99368410000000.0
Epoch: 1300/10000 Loss: 88925840000000.0
Epoch: 1400/10000 Loss: 81028790000000.0
Epoch: 1500/10000 Loss: 74668535000000.0
Epoch: 1600/10000 Loss: 69297890000000.0
Epoch: 1700/10000 Loss: 64574640000000.0
Epoch: 1800/10000 Loss: 60283637000000.0
Epoch: 1900/10000 Loss: 56295533000000.0
Epoch: 2000/10000 Loss: 52524070000000.0
Epoch: 2100/10000 Loss: 48925063000000.0
Epoch: 2200/10000 Loss: 45459750000000.0
Epoch: 2300/10000 Loss: 42112372000000.0
Epoch: 2400/10000 Loss: 38868896000000.0
Epoch: 2500/10000 Loss: 35743810000000.0
Epoch: 2600/10000 Loss: 32742763000000.0
Epoch: 2700/10000 Loss: 29852806000000.0
Epoch: 2800/10000 Loss: 27081340000000.0
Epoch: 2900/10000 Loss: 24441560000000.0
Epoch: 3000/10000 Loss: 21946470000000.0
Epoch: 3100/10000 Loss: 19601977000000.0
Epoch: 3200/10000 Loss: 17414532000000.0
Epoch: 3300/10000 Loss: 15395135000000.0
Epoch: 3400/10000 Loss: 13546737000000.0
Epoch: 3500/10000 Loss: 11869900000000.0
Epoch: 3600/10000 Loss: 10365760000000.0
Epoch: 3700/10000 Loss: 9022828000000.0
Epoch: 3800/10000 Loss: 7828565000000.0
Epoch: 3900/10000 Loss: 6765571000000.0
Epoch: 4000/10000 Loss: 5823080000000.0
Epoch: 4100/10000 Loss: 4990639000000.0
Epoch: 4200/10000 Loss: 4259004500000.0
Epoch: 4300/10000 Loss: 3619611700000.0
Epoch: 4400/10000 Loss: 3064605200000.0
Epoch: 4500/10000 Loss: 2590721700000.0
Epoch: 4600/10000 Loss: 2192407400000.0
Epoch: 4700/10000 Loss: 1864649100000.0
Epoch: 4800/10000 Loss: 1597569000000.0
Epoch: 4900/10000 Loss: 1380381600000.0
Epoch: 5000/10000 Loss: 1202861100000.0
Epoch: 5100/10000 Loss: 1058127100000.0
Epoch: 5200/10000 Loss: 939163900000.0
Epoch: 5300/10000 Loss: 840591540000.0
Epoch: 5400/10000 Loss: 758178300000.0
Epoch: 5500/10000 Loss: 687998500000.0
Epoch: 5600/10000 Loss: 627689200000.0
Epoch: 5700/10000 Loss: 575582900000.0
Epoch: 5800/10000 Loss: 530246400000.0
Epoch: 5900/10000 Loss: 490397470000.0
Epoch: 6000/10000 Loss: 455180550000.0
Epoch: 6100/10000 Loss: 423790480000.0
Epoch: 6200/10000 Loss: 395681070000.0
Epoch: 6300/10000 Loss: 370407100000.0
Epoch: 6400/10000 Loss: 347494480000.0
Epoch: 6500/10000 Loss: 326646300000.0
Epoch: 6600/10000 Loss: 307627660000.0
Epoch: 6700/10000 Loss: 290129280000.0
Epoch: 6800/10000 Loss: 273938200000.0
Epoch: 6900/10000 Loss: 258948480000.0
Epoch: 7000/10000 Loss: 245060340000.0
Epoch: 7100/10000 Loss: 232265710000.0
Epoch: 7200/10000 Loss: 220370030000.0
Epoch: 7300/10000 Loss: 209295410000.0
Epoch: 7400/10000 Loss: 198971560000.0
Epoch: 7500/10000 Loss: 189266130000.0
Epoch: 7600/10000 Loss: 180154430000.0
Epoch: 7700/10000 Loss: 171579100000.0
Epoch: 7800/10000 Loss: 163506320000.0
Epoch: 7900/10000 Loss: 155893020000.0
Epoch: 8000/10000 Loss: 148692430000.0
Epoch: 8100/10000 Loss: 141869370000.0
Epoch: 8200/10000 Loss: 135439640000.0
Epoch: 8300/10000 Loss: 129361560000.0
Epoch: 8400/10000 Loss: 123593610000.0
Epoch: 8500/10000 Loss: 118108680000.0
Epoch: 8600/10000 Loss: 112880360000.0
Epoch: 8700/10000 Loss: 107913210000.0
Epoch: 8800/10000 Loss: 103188360000.0
Epoch: 8900/10000 Loss: 98686030000.0
Epoch: 9000/10000 Loss: 94398145000.0
Epoch: 9100/10000 Loss: 90303500000.0
Epoch: 9200/10000 Loss: 86406760000.0
Epoch: 9300/10000 Loss: 82687780000.0
Epoch: 9400/10000 Loss: 79144060000.0
Epoch: 9500/10000 Loss: 75771310000.0
Epoch: 9600/10000 Loss: 72547975000.0
Epoch: 9700/10000 Loss: 69478646000.0
Epoch: 9800/10000 Loss: 66521575000.0
Epoch: 9900/10000 Loss: 63701450000.0
Epoch: 10000/10000 Loss: 61003633000.0
Step 6: Automating the runs
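Step 6 repeats the whole pipeline for several layer configurations; the output below is for configuration 5, which uses VGG layers up to the fourth pooling layer. The following is a hypothetical sketch of such an automation loop: `synthesize_texture` is a stand-in name for one full run (build VGG16, define the Gram loss on the listed layers, optimize the noise image), and the configuration-to-layer table is one plausible reading of "Configuration 5 - up to pooling layer 4", not the repository's exact mapping.

```python
# Hypothetical config -> cumulative VGG layer mapping (illustrative only).
CONFIGS = {
    1: ['conv1_1'],
    2: ['conv1_1', 'pool1'],
    3: ['conv1_1', 'pool1', 'pool2'],
    4: ['conv1_1', 'pool1', 'pool2', 'pool3'],
    5: ['conv1_1', 'pool1', 'pool2', 'pool3', 'pool4'],
}

def synthesize_texture(layers, epochs=10000):
    """Stub standing in for one full synthesis run over the given VGG layers."""
    pass

for config_id, layers in sorted(CONFIGS.items()):
    print('Configuration : {} - up to {}'.format(config_id, layers[-1]))
    synthesize_texture(layers)
```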
Configuration : 5 - Upto Pooling Layer 4
D:\LEARN\Deep Learning\3. Style Transfer Paper implementations\Texture generation from Image using CNN\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
[]
All layers' outputs have been computed successfully.
D:\LEARN\Deep Learning\3. Style Transfer Paper implementations\Texture generation from Image using CNN\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
[<tf.Variable 'input_noise:0' shape=(1, 256, 256, 3) dtype=float32_ref>]
Epoch: 1000/10000 Loss: 172597660000000.0
Epoch: 2000/10000 Loss: 58709074000000.0
Epoch: 3000/10000 Loss: 26250159000000.0
Epoch: 4000/10000 Loss: 7911461400000.0
Epoch: 5000/10000 Loss: 2571915500000.0
Epoch: 6000/10000 Loss: 1335424300000.0
Epoch: 7000/10000 Loss: 869748830000.0
Epoch: 8000/10000 Loss: 622193900000.0
Epoch: 9000/10000 Loss: 470882400000.0
Epoch: 10000/10000 Loss: 376950520000.0
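After the final epoch, the optimized noise variable is post-processed and displayed (steps 5 and 7 above). A minimal sketch, assuming the result returned by `sess.run(input_noise)` is a `[1, 256, 256, 3]` float array in the 0-255 range (the repository's actual scaling may differ):

```python
import numpy as np
import matplotlib.pyplot as plt

def postprocess_and_show(result, path='synthesized_texture.png'):
    # result: [1, 256, 256, 3] float array from sess.run(input_noise)
    image = np.clip(result[0], 0, 255).astype(np.uint8)  # drop batch dim, clamp to valid pixel values
    plt.imshow(image)
    plt.axis('off')
    plt.savefig(path, bbox_inches='tight')
    plt.show()
```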