Week 14 Progress Update

This week's blog is going to be very short, as I made essentially no progress. I was planning on continuing to analyze data, but unfortunately I was unable to gather any more because my computer was running unfathomably slowly. As such, I have simply spent the last week trying to speed up my computer, all of which is menial work, meaning I won't have any images or anything else to show. I think it is now at a manageable speed, so by at least the middle of next week I should be able to continue with my plans. Unfortunately, this means I will have lost a week of data collection and analysis, which is rather frustrating.

Week 13 Progress Update

This week was mainly spent analyzing the sessions I have run so far. I came up with a system where I apply the weights to three different images: a checkered pattern, black-and-white noise, and a sample from the CIFAR-10 dataset, more specifically a chicken. Applying the weights to these three images lets me see the patterns these filters "learn" to "look for". While I don't have a particularly precise method of analyzing the resulting images, I have developed a system that allows for semi-consistent analysis that can be compared across sessions. Each observation takes the form "x/n property (c/10)", where x is the number of filters that appear to have the given property out of the total number n of filters in that layer, and c/10 is my certainty that they have it, with 10 being very certain and 1 being not certain at all. The document for one of my sessions is shown below.
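For readers curious what the three-image probe looks like in code, here is a minimal sketch. The filter file names, the filter shape, and the use of NumPy/SciPy/Matplotlib are my assumptions; the post doesn't specify the actual shapes or libraries used.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import convolve2d

# Hypothetical saved first-layer weights: (height, width, in_channels, n_filters)
weights = np.load("10k_run1_conv1.npy")  # assumed file name

# The three probe images: checkered pattern, B/W noise, and a CIFAR-10 sample.
size = 32
checker = np.indices((size, size)).sum(axis=0) % 2   # checkerboard pattern
noise = np.random.rand(size, size)                   # black-and-white noise
cifar = np.load("cifar_chicken.npy").mean(axis=2)    # assumed grayscale sample

probes = {"checker": checker, "noise": noise, "cifar": cifar}

# Convolve every filter with every probe image and show the responses in a grid,
# one row per probe image, one column per filter.
n_filters = weights.shape[-1]
fig, axes = plt.subplots(len(probes), n_filters, figsize=(2 * n_filters, 6))
for row, (name, img) in enumerate(probes.items()):
    for col in range(n_filters):
        kernel = weights[:, :, :, col].mean(axis=2)  # collapse channels for display
        response = convolve2d(img, kernel, mode="same")
        axes[row, col].imshow(response, cmap="gray")
        axes[row, col].set_axis_off()
        if col == 0:
            axes[row, col].set_title(name, loc="left")
plt.tight_layout()
plt.show()
```

Patterns the filter "looks for" show up as strong responses in these grids, which is what the x/n property (c/10) judgments are made from.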

Week 12 Progress Update

This week I made some significant progress: I finished my program that shows the convolutions as applied to specific images, which should help me considerably. I haven't been able to start analyzing anything yet, but even from a few tests of the program I can tell that it will be extremely helpful. It is integrated into the program that displays the weights themselves, so I can run it and view everything at once. I also slightly modified my naming convention to include batch size, because I'm planning to vary the batch size in order to increase accuracy as well as get some different angles on the data; a sketch of the extended convention follows below. Above: the code that graphs the convolutions as applied to an image. Below: the weights from the first layer of a batch-size-10, 10,000-image run, as applied to an image of a bird (chicken).
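The exact format of the extended naming convention isn't given in the post; the following is one guess at how batch size could slot into the existing "(# of images)_run(run #)" scheme:

```python
def weight_filename(num_images, batch_size, run):
    """e.g. weight_filename(10_000, 10, 1) -> '10k_b10_run1' (hypothetical format)."""
    return f"{num_images // 1000}k_b{batch_size}_run{run}"

print(weight_filename(10_000, 10, 1))  # 10k_b10_run1
print(weight_filename(30_000, 50, 2))  # 30k_b50_run2
```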

Week 11 Progress Update

This week I started to record and analyze the output of the neural network. Using the saving feature of my neural network, I ran the network twice, first on 10,000 images and then on 30,000 images. I saved the weights, naming them using a system of the form (# of images)_run(run #), for example 10k_run1. I made a program that outputs the weights in a grid, with each square representing a weight: high weights are drawn red and low weights blue, as shown below. I then began adding a function that can apply the weights to an image so I can analyze them more effectively. I haven't quite completed this yet, but as soon as I do I should be well prepared to analyze the weights once I have collected a few more samples. I might additionally make a program to average together several weight matrices from samples of the same size, so that I can get a more accurate general example. Below: the first- and second-level weights for the 10k run and the 30k run.
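A minimal sketch of the red/blue weight grid, assuming Matplotlib and a diverging colormap; the post only says high weights are red and low weights are blue, so the colormap choice, file name, and filter shape below are mine:

```python
import numpy as np
import matplotlib.pyplot as plt

weights = np.load("10k_run1.npy")   # assumed shape: (h, w, in_channels, n_filters)
n_filters = weights.shape[-1]
vmax = np.abs(weights).max()        # symmetric scale so zero weights map to white

fig, axes = plt.subplots(1, n_filters, figsize=(2 * n_filters, 2))
for i, ax in enumerate(axes):
    # Collapse input channels so each filter is one square grid of weights;
    # 'bwr' draws high values red and low values blue.
    ax.imshow(weights[:, :, :, i].mean(axis=2), cmap="bwr", vmin=-vmax, vmax=vmax)
    ax.set_axis_off()
plt.show()

# Averaging several same-size runs, as mentioned above, could be as simple as:
# avg = np.mean([np.load(f"10k_run{r}.npy") for r in (1, 2, 3)], axis=0)
```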

Week 10 Progress Update

It's been a while since I've posted an update, so it's not actually week 10 anymore; however, I've decided that it will make more sense if I just continue the current numbering. Over the past few weeks I have accomplished a lot. First of all, I got the ConvNet working. Instead of posting tons of pictures of my code, I'm just going to upload it to GitHub, where it will be visible at this link. That repository will also contain all of the later updates I make to the code. Beyond making the CNN work, I implemented two main features: saving and loading kernels, and testing custom images. Kernel saving and loading lets me view the convolutional weights, and also allows for makeshift loading across non-continuous training sessions; however, it is not meant for that, and I'm not planning on using it that way because it could cause issues with accuracy. The custom image testing is not really important, but it does give me a more accessible accuracy check.
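The actual implementation is in the GitHub repository linked above; as an illustration only, here is one common way kernel saving and loading is done, assuming a TensorFlow 1.x-style graph (my guess at the setup, given the era of the project):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Hypothetical first-layer kernel: 5x5 filters, 3 input channels, 32 filters.
conv1_kernel = tf.get_variable("conv1/kernel", shape=[5, 5, 3, 32])
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training would happen here ...
    saver.save(sess, "10k_run1")        # write kernels to a checkpoint

with tf.Session() as sess:
    saver.restore(sess, "10k_run1")     # the "makeshift loading" mentioned above
    kernels = sess.run(conv1_kernel)    # NumPy array, ready for visualization
```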

Week 9 Progress Update

This week I spent a large amount of time preparing my code to be partially rewritten. I discovered that I actually have to use a slightly different input pipeline, and implementing it will require changing a large part of the code. One of the things I often try to do before making big changes is clean the code up so it is more readable and optimized. In cleaning up my code, I changed my data-processing code for the fifth time, this time including code that lets me make sure the images are valid, as well as processing the binary data as Python lists before converting it to tensors. I also added some checkpoints in my code to make sure everything was running properly. After all this, I am about ready to edit my code and be done with this segment of the research; however, I am a little behind on what I was planning to accomplish, so I am planning on implementing only one or two visualization methods instead of the four I had planned.
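As a sketch of what that binary-to-lists preprocessing might look like: the CIFAR-10 binary format stores each record as 1 label byte followed by 3,072 pixel bytes (the R, G, and B planes of a 32x32 image). The specific validity checks below are my guesses at what "making sure the images were valid" involved.

```python
import numpy as np

RECORD_BYTES = 1 + 32 * 32 * 3  # label byte + R, G, B planes

def load_cifar_batch(path):
    with open(path, "rb") as f:
        raw = f.read()
    # Checkpoint: the file must contain a whole number of records.
    assert len(raw) % RECORD_BYTES == 0, "truncated or corrupt batch file"

    labels, images = [], []
    for i in range(0, len(raw), RECORD_BYTES):
        record = list(raw[i:i + RECORD_BYTES])  # process as a Python list first
        label, pixels = record[0], record[1:]
        # Checkpoint: label in range, pixel count correct.
        assert 0 <= label <= 9 and len(pixels) == 3072
        labels.append(label)
        images.append(pixels)

    # Convert to tensors only after validation, reshaping the channel-major
    # planes into 32x32x3 images.
    images = np.array(images, dtype=np.uint8).reshape(-1, 3, 32, 32)
    return np.array(labels), images.transpose(0, 2, 3, 1)

labels, images = load_cifar_batch("cifar-10-batches-bin/data_batch_1.bin")
print(labels.shape, images.shape)  # (10000,) (10000, 32, 32, 3)
```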

Week 8 Progress Update

Progress is relatively slow. I debugged to the point where the network produces output, but unfortunately the output is erroneous. Along with this, the program throws an error after only one pass through the optimization loop. I have been stuck on this problem for the past few days, and if I can't fix it by the end of next week I'm going to start looking at visualization algorithms and come back to the actual CNN later. Error given after the first optimization step (above), and output from the first optimization step (below).