Homepage on MMP | Oscar Pocock
https://mmp.oscar.blue/
Recent content in Homepage on MMP | Oscar Pocock

Week 10
https://mmp.oscar.blue/posts/week-10/
Sun, 10 Apr 2022 13:13:48 +0100

This week I was busy applying for jobs and working on the CS38220 assignment, so I only managed to do some code clean-up and implement a config file which allows a user to change the order of filters and other settings.

Config file example: config.yml

```yaml
---
# Config file for autophotographer

# List of filters to apply in order
filters:
  - brightness
  - filesize
  - contrast
  - focus

# Whether or not to apply CNN ranking
CNNrank: True

# Options for focus filter
brightness_options:
  threshold: 0.
```

Week 9
https://mmp.oscar.blue/posts/week-9/
Sun, 03 Apr 2022 13:13:48 +0100

This week I worked on getting focus detection working. I implemented basic Laplacian blur detection[1] and fast Fourier blur detection[2]. Finding a suitable threshold for both can be a challenge.

General pipeline

I continued to develop a general pipeline to fit all the filters into.

1. https://pyimagesearch.com/2015/09/07/blur-detection-with-opencv/
2. https://pyimagesearch.com/2020/06/15/opencv-fast-fourier-transform-fft-for-blur-detection-in-images-and-video-streams/

Week 8
https://mmp.oscar.blue/posts/week-8/
Sun, 27 Mar 2022 13:13:48 +0100

Monday

Towards the end of last week (Week 7), I managed to refactor my code to make it more portable. This allowed me to train my model on different machines. I ran my training script on the uni's GPU compute successfully for 20 epochs. The next stage was to train it for longer and analyse the results. On the Monday morning I adjusted the parameters of my training script to train for 2000 epochs instead.

Week 7
https://mmp.oscar.blue/posts/week-7/
Sun, 20 Mar 2022 12:40:18 +0100

Now that I had successfully run my model without any runtime errors, the next step this week was finding some GPU compute so I could train my model on much more powerful hardware to accelerate the training. My first idea was to use cloud computing. There are machine-learning-specific cloud services, but I didn't want to use these, as I didn't want my code to depend on the particular structure each cloud platform expects.

Week 6
https://mmp.oscar.blue/posts/week-6/
Sun, 13 Mar 2022 12:40:17 +0100

This week I finished programming the basic CNN model using transfer learning. I decided to train it for 20 epochs to make sure there weren't any runtime errors in my code. As I don't own an Nvidia GPU (I have an AMD GPU), I couldn't make use of the PyTorch version that utilises CUDA to speed up processing. There is a ROCm version of PyTorch for AMD GPUs[1], but ROCm isn't as mature as CUDA and only officially supports a small subset of Linux distributions.

Week 5
https://mmp.oscar.blue/posts/week-5/
Sun, 06 Mar 2022 12:40:14 +0100

Starting to write a CNN

This week I started to implement what I had learnt about CNNs in Week 4. At this point I hadn't designed a CNN architecture to implement; instead, I wanted to have a running model, regardless of performance, just to see if I could implement one and understand it.
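As a rough illustration of the kind of transfer-learning model Weeks 5 and 6 describe, here is a minimal sketch. It assumes PyTorch and torchvision with a pretrained ResNet-18 backbone; the layer sizes and single-score output head are illustrative, not the project's actual architecture.

```python
import torch
import torch.nn as nn
from torchvision import models


def build_model() -> nn.Module:
    # Start from ImageNet-pretrained weights and freeze the backbone,
    # so only the newly added head is trained.
    model = models.resnet18(pretrained=True)
    for param in model.parameters():
        param.requires_grad = False
    # Replace the final fully connected layer with a small head that
    # outputs a single aesthetic score per image.
    model.fc = nn.Sequential(
        nn.Linear(model.fc.in_features, 64),
        nn.ReLU(),
        nn.Linear(64, 1),
    )
    return model


if __name__ == "__main__":
    model = build_model()
    # A dummy batch of four 224x224 RGB images, just to confirm the
    # forward pass runs without errors.
    scores = model(torch.randn(4, 3, 224, 224))
    print(scores.shape)  # torch.Size([4, 1])
```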
Halfway through the implementation, I decided to look back at the existing research papers on aesthetic judgement to see which aspects of their systems and CNNs were important to the task.

Week 4
https://mmp.oscar.blue/posts/week-4/
Sun, 27 Feb 2022 12:46:59 +0000

This week I did some research into how to build a CNN from scratch, including the different types of layers, loss functions, learning rates, epochs and other core concepts.[1][2] I also set up and created this blog with Hugo to document my progress, and set up Woodpecker CI to do continuous testing and integration.

1. deeplizard. "Convolutional Neural Networks (CNNs) explained." (Dec. 9, 2017). Accessed: Feb. 22, 2022. [Online Video]. Available: https://youtube.

Week 3
https://mmp.oscar.blue/posts/week-3/
Sun, 20 Feb 2022 12:46:55 +0000

Filming footage

At the start of the week I went into town to film some practice footage to work with later (up until this point I had been experimenting with footage limited by my bedroom walls). I took some basic vertical and horizontal footage of the town - no nature or beach footage yet.

Gaining more useful information

I used the footage I had recorded at the start of the week and revised my "filesize" code.

Week 2
https://mmp.oscar.blue/posts/week-2/
Sun, 13 Feb 2022 12:46:54 +0000

This week I set up my repositories and started writing some basic code.

Set up

Before starting any coding, I wanted to set up my remote git repositories. I had already decided I wanted the project mirrored over two remote git repositories from different providers as a safety precaution. My initial plan was to use the university's GitLab instance, but as it had recently been moved behind the firewall, it would have made mirroring quite difficult.

Week 1
https://mmp.oscar.blue/posts/week-1/
Sun, 06 Feb 2022 12:46:51 +0000

This week is the first week of the project. I researched academic papers, existing code and datasets relating to the topic of determining aesthetics.

Papers

- Photo Aesthetics Analysis via DCNN Feature Encoding[1] - predicting aesthetic performance using a bespoke CNN solution
- AVA: A large-scale database for aesthetic visual analysis[2] - the making of an aesthetic visual analysis dataset

Code

- Image Quality Assessment - Convolutional Neural Networks to predict the aesthetic and technical quality of images.
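For reference, here is a minimal sketch of the variance-of-Laplacian focus check described in Week 9, following the approach in the cited pyimagesearch tutorial. It assumes OpenCV; the threshold value and filename are illustrative, not the project's actual settings.

```python
import cv2


def is_in_focus(image_path: str, threshold: float = 100.0) -> bool:
    # Load the frame and convert it to greyscale.
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # The variance of the Laplacian is low when the frame has few sharp
    # edges, which suggests it is blurry.
    focus_measure = cv2.Laplacian(gray, cv2.CV_64F).var()
    return focus_measure >= threshold


if __name__ == "__main__":
    print(is_in_focus("frame_0001.jpg"))  # hypothetical frame filename
```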