'use strict';
(function () {
  // FlexSearch configuration: cache queries, index the title and content
  // fields, and store title/href/section so results can be rendered directly.
  const options = { cache: true };
  options.doc = {
    id: "id",
    field: ["title", "content"],
    store: ["title", "href", "section"]
  };

  const index = FlexSearch.create("balance", options);
  window.bookSearchIndex = index;

  index.add({
    id: 0,
    href: "/docs/developer/",
    title: "Developer Documentation",
    section: "Docs",
    content: "This is the developer documentation. (Work in progress)\n"
  });

  index.add({
    id: 1,
    href: "/docs/user/",
    title: "User Documentation",
    section: "Docs",
    content: "This is the user documentation. (Work in progress)\n"
  });

  index.add({
    id: 2,
    href: "/posts/week-8/",
    title: "Week 8",
    section: "Blog",
    content: "Monday # Towards the end of last week (Week 7), I managed to refactor my code to make it more portable. This allowed me to train my model on different machines. I ran my training script on the uni's GPU compute successfully for 20 epochs. The next stage was to train for longer and analyse the results, so on Monday morning I adjusted the parameters of my training script to train for 2000 epochs instead.\n" +
      "Tuesday # By Tuesday afternoon the training had finished and I had a model trained for 2000 epochs. This gave me a day to analyse the results and make some rough predictions before my mid-project demo on the Wednesday.\n" +
      "Training and validation loss graphs # As we can see from the 2000-epoch graph, the loss plateaus at around 60 epochs. The training loss settles lower than the validation loss, which means the model isn't fully learning what I want it to; it is also overfitting slightly, since it predicts the training set better than the validation set. The variance in the validation loss suggests that the features the model has learned aren't the right ones to confidently predict aesthetics in this dataset.\n" +
      "For the rest of the day I worked on my prediction script so I could use the model on new pictures. I also worked on my architecture diagrams and slides for the mid-project demo.\n" +
      "Due to the way I processed my images (resizing them to 32x32, converting them to a tensor and saving that to disk), my prediction script also displayed those down-sized images. This may also have affected the performance of the model.\n" +
      "Wednesday # I spent most of Wednesday morning finishing my slides and diagrams and making example predictions using the prediction script.\n" +
      "Rest of week # I spent the rest of the week looking at the project's overall pipeline, including the non-machine-learning filtering. I also started to implement basic focus detection by looking at blur detection using the Laplacian operator [1].\n" +
      "[1] https://pyimagesearch.com/2015/09/07/blur-detection-with-opencv/\n"
  });

  index.add({
    id: 3,
    href: "/posts/week-7/",
    title: "Week 7",
    section: "Blog",
    content: "Now that I had successfully run my model without any runtime errors, the next step this week was finding some GPU compute so I could train my model on much more powerful hardware and accelerate the training.\n" +
      "My first idea was to use cloud computing. There are machine-learning-specific cloud services, but I didn't want my code to depend on the particular structure each platform requires. Instead, I wanted a general VM with an attached GPU where I could run my workloads manually. I had already written Docker images containing all the dependencies of my code, which I could deploy to these VMs to ensure a reproducible and portable environment.\n" +
      "The first place I looked was Linode. However, when I contacted their support they said I needed at least $100 of transactions on my account in order to request access to their GPU instances. They also noted I could make a deposit of $100 to start using them straight away. I wasn't yet sure whether training my model would use up $100, so I didn't want to risk it.\n" +
      "I then looked at Microsoft's Azure. I had used Azure during my industrial year and had previously passed the fundamentals and associate administrator exams"
  });
})();
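// For reference, a minimal sketch of how the Book theme's UI might query the
// index built above. This is an assumption based on FlexSearch 0.6's
// documented API (search(query, limit) returns the stored fields for each
// match), not code that ships in this file; the helper name is hypothetical.
function exampleSearch(query) {
  // Returns up to 10 matches, each carrying the stored title/href/section.
  return window.bookSearchIndex.search(query, 10);
}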
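// The Week 8 post above mentions blur detection via the variance of the
// Laplacian (see the pyimagesearch link in that post). A minimal
// self-contained sketch of that idea in plain JavaScript; the blog's actual
// implementation used OpenCV in Python, so the function names and the
// threshold below are illustrative assumptions only.
function laplacianVariance(gray, width, height) {
  // gray is a row-major flat array of grayscale values, length width * height.
  // 3x3 Laplacian kernel: centre -4, N/S/E/W neighbours +1.
  const responses = [];
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      const i = y * width + x;
      responses.push(
        gray[i - width] + gray[i + width] + gray[i - 1] + gray[i + 1] - 4 * gray[i]
      );
    }
  }
  const mean = responses.reduce((a, b) => a + b, 0) / responses.length;
  // Low variance of the Laplacian means few edges, i.e. a likely blurry image.
  return responses.reduce((a, b) => a + (b - mean) * (b - mean), 0) / responses.length;
}

function isBlurry(gray, width, height, threshold = 100) {
  return laplacianVariance(gray, width, height) < threshold;
}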