Serverless technologies, such as AWS Lambda, are handy tools for deploying code which needs to run infrequently. Rather than keeping a server running continuously, these services offer on-demand computing resources. They are designed to support lightweight tools which perform simple tasks. However, if a lightweight tool requires one or two heavyweight libraries, then deploying it to these services can become problematic. With a little work, though, the problems can be worked around, and switching to a more complex solution can be delayed.
I recently needed to deploy a simple machine learning script which would run weekly, so I turned to AWS Lambda. The Python script needed to pull some data from a data warehouse, perform some simple calculations, and then update the data warehouse. The script used scikit-learn, and that alone was enough to put it over the AWS Lambda deployment package size limit. To add to the trouble, the script also required a data connector library which was nearly as large as scikit-learn.
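One common workaround is to trim the vendored dependencies before zipping the deployment package, since large libraries often ship test suites and caches that aren't needed at runtime. Here is a minimal sketch of that idea; the directory names and file patterns are assumptions for illustration, so verify your particular libraries don't need them before deleting anything:

```python
import os
import shutil

def slim_package_dir(package_dir):
    """Prune files that are typically not needed at runtime to shrink
    a Lambda deployment package. The patterns below are illustrative;
    confirm your libraries do not load these files before deleting."""
    removable_dirs = {"tests", "test", "__pycache__"}
    removable_exts = (".pyc", ".pyo")
    for root, dirs, files in os.walk(package_dir, topdown=True):
        # Remove unwanted directories and stop walking into them.
        for d in list(dirs):
            if d in removable_dirs:
                shutil.rmtree(os.path.join(root, d))
                dirs.remove(d)
        # Remove compiled bytecode files.
        for f in files:
            if f.endswith(removable_exts):
                os.remove(os.path.join(root, f))
```

Run against the directory where `pip install --target` placed the dependencies, this kind of pruning can claw back a surprising amount of space, though for something the size of scikit-learn it may only delay the inevitable.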
Despite having done it for a few years, I’m not a very good climber. Unlike most people, I, for some odd reason, started by climbing outdoors. I learned to set up top-rope anchors and climb the short walls of Whipp’s Ledges. My friends are much better climbers than I am, probably because they are more dedicated to practicing in the gym than I am. However, I may have a slight advantage when it comes to building anchors and rappelling. So when Doug suggested that we plan an outdoor climbing trip to Seneca Rocks where we would get to do some easy trad routes, I was super excited to go.
I might have become slightly less excited after Doug began explaining the challenges associated with these routes. From a technical perspective the routes are very easy. From a mental perspective they can be a little more challenging, at least for a beginner. We would take routes which would leave us quite exposed, and rappels would be long enough to require using two ropes. I’ve led sport routes of around 80 feet; here we would be looking down 160 to 220 feet. It would definitely be outside of my comfort zone, which is a good place to go sometimes.
I come from a large extended family, and Easter was one of the days we all got together. My grandmother was a very kind and understanding person, so all of the various personalities that make up a large family were welcome in her home. After my grandmother passed, attendance at family gatherings began to wane. I live two hours away from my extended family and was one of the last regular attendees. Since we live so far away, spending time with family was the only thing we did on these holidays. So, when that tradition came to an end, we needed to find a new tradition to bring our family together.
Perhaps we could make a tradition of connecting with family by getting away and enjoying nature. Last year we decided to go backpacking at Archer’s Fork over Easter. It snowed. So, this year we looked for some alternative to tent backpacking. We found Oil Creek State Park, which has the 12-mile Gerard Hiking Trail and offers camping in Adirondack-style shelters.
Like most years, my boys and I went on a backpacking trip over spring break. This year we were either going to the Spruce Knob area or to Dolly Sods. Both of these places offer alpine-like highlands that we are not used to seeing in Ohio. Weather would determine the location. The weather at Dolly Sods was forecast to be slightly colder with a slightly higher chance of rain, so we decided to go to Spruce Knob.
We were originally going to leave Akron on Wednesday after work, but we couldn’t make the timing work. It’s just as well: the drive to Spruce Knob is about five and a half hours, and the last portion follows a winding, guardrail-less, one-and-a-half-lane road up the side of the mountain. So not arriving after dark was probably for the best.
I just wrapped up a challenging computer vision project and have been thinking about lessons learned. Before we started the project I looked for information about what was possible with the latest technology. I wanted to know what sort of accuracy (precision and recall) I could expect under various conditions. I understand that every application is different, but I wanted at least a rough idea. I didn’t find the type of details that we needed, so we approached the problem in a way that would give us flexibility to change our approach with minimal rework. We used the Tensorflow Object Detection API as the main tool for creating an object detection model. I wanted to share, in general terms, some of the things which we discovered. My goal is to give someone else who is approaching a computer vision problem some information which may help guide their choices.
The customer’s objective was to get an inventory of widgets sitting on a rack of shelves. The widgets were fairly large and valuable, but for various reasons RFID and other radio-based solutions were not an option. So, that is where computer vision came in. Using computer vision to solve this problem was not going to be an easy task. There were many obstacles to overcome, and I will discuss them in turn.
I previously wrote about setting up Tensorflow for object detection on macOS. After getting everything set up on the Mac I very quickly decided that it would be worth it to get Tensorflow running on something other than my main development computer. Running Tensorflow to train computer vision models on my Mac consumed all available computing resources. Nothing else could be done while training was in progress. And, it was not taking advantage of the GPU. Using a dedicated Ubuntu machine with a GeForce GTX 1060 graphics card would be a much better option.
It took a lot of work to get a GPU-enabled version of Tensorflow installed and running properly. Then, after it had been working for a few months, a kernel update caused it to suddenly stop working. I didn’t immediately know that the problems were caused by a kernel update; I thought some other updated dependency was the culprit, so I didn’t just roll the kernel back and call it a day. Again I had to spend a good amount of time piecing together different references in order to get everything working properly. I documented everything in one place to make the process easier in the future.
The real moral of the story is probably that it is worth it to use cloud-based compute resources for these sorts of tasks. That is especially true if the task allows for TPUs to be used. Regardless, if you have an unused gaming or mining machine sitting around and want to get Tensorflow running on it, this is how I did it.
Tensorflow is an amazing tool, but it can be intimidating to get it up and running. I wanted to help someone get started with Tensorflow on a Mac, and if I had written all of this down then I could have just given them a link. This post isn’t a captivating read; it’s basically a list of commands, but it allows me to easily share my setup instructions. I’ll try to give a short explanation of what needs to happen and then a block with the required commands. If you are in a hurry, don’t care to know what is happening, or have an odd blind trust in me, then you can skip the explanation and type in the commands at the end of each section. Conversely, if you enter the commands as you follow along with the explanation, then you don’t need to re-type the commands that appear at the end of each section.
There are many different applications for Tensorflow. We are currently using Tensorflow for object detection, so these instructions install everything which is required for that task. Other tasks could require additional prerequisites. If you are still here, then you probably need to get the Tensorflow Object Detection API set up. In that case, open up a terminal window and read on.
On our first climbing trip, in the still very frigid month of March, a group of six of us went to Muir Valley. For our second trip, in the surprisingly sweltering month of May, four of us made the journey to the Red River Gorge. For our last trip of the year, our group had dwindled to three. The rest of the group was, respectively, busy taking care of a new human, finishing a doctorate, and settling in to a new job. Which, from a climbing perspective, was a shame. The weather for this trip was just about perfect. It was cool with low humidity, and the remnants of Hurricane Willa passed to the north of Kentucky, yielding only mild showers on our travel day.
The only downside to beautiful weather is that everyone likes to get outside and enjoy it. I would guess that the number of climbers out on any given day is somehow proportional to the quality of the weather. Fortunately, there is so much accessible climbing in the Red River Gorge that, with a little planning, it is not hard to find a wall to climb. And, great weather opens up routes which are less desirable under cold or rainy conditions. Our unofficial guide, Doug, picked some crags in the Pendergrass-Murray Recreational Preserve (PMRP) which seemed likely to give us good climbing opportunities.
Last summer we drove from Akron to Fort Collins, Colorado. Although it was a great experience, we wanted to drive a little less this summer. So we came up with a new adventure idea. We wanted to find a place which was off the grid, but had front-country amenities, like running water, toilets, and great food. And, it had to be within about eight hours of Akron. It seemed like an impossible ask, and I was fairly sure that we would have to compromise on at least one aspect. Then I found Charit Creek Lodge in Tennessee. Amazingly, it has all the desired amenities and is just seven hours and fifty minutes away. As an added bonus, it costs about the same as a stay at a major hotel chain.
Rather than driving straight to Charit Creek, we decided to break our trip up. We were going to do a mix of backpacking at Zaleski, car camping at Cumberland Lake, and lodge camping at Charit Creek.
On this site I write about what I like to believe is a diverse set of topics. The normal way of presenting posts, as a sequential list, does nothing to help people discover other material on the site that may also interest them. I wanted to provide visitors with a list of links to content which is similar to the page they are currently viewing. However, due to limitations in the platform I’m using, there was no option to simply turn this on. So, I wrote some code and implemented an algorithm to solve this problem.
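The general approach can be sketched with plain TF-IDF weighting and cosine similarity: score every other post against the current one and keep the top few. This is a simplified, self-contained illustration of the idea, not the exact code running on this site:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split a document into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def tfidf_vectors(docs):
    """Build a sparse TF-IDF vector (dict of term -> weight) per document."""
    tokenized = [Counter(tokenize(d)) for d in docs]
    n = len(docs)
    df = Counter()  # document frequency of each term
    for counts in tokenized:
        df.update(counts.keys())
    vectors = []
    for counts in tokenized:
        total = sum(counts.values())
        vectors.append({
            term: (count / total) * math.log(n / df[term])
            for term, count in counts.items()
        })
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def related_posts(docs, index, top_n=3):
    """Indices of the posts most similar to docs[index]."""
    vectors = tfidf_vectors(docs)
    scores = [(cosine(vectors[index], v), i)
              for i, v in enumerate(vectors) if i != index]
    scores.sort(reverse=True)
    return [i for _, i in scores[:top_n]]
```

Since the list only needs to change when a post is added, the similarity scores can be computed once at build time rather than on every page view.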
This details the hardware design for a simple 12-bit microprocessor. I created it for an undergraduate class which I took a few years ago. It is not really useful for anything besides learning how computer hardware works, but I still think that it is pretty cool. I found the documentation for it on my hard drive and remembered how proud I was to have actually completed it; I am a computer scientist, not a computer engineer. Simple logic gates are used as the basis for the creation of more complex digital electronic circuits; those circuits, including a control unit, are in turn connected via a datapath to form a completed processor. The processor datapath is designed to implement the Simple-12 instruction set.
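As a flavor of how simple gates compose into larger circuits, here is a small illustration in Python rather than hardware (it is not part of the original design documents): a NAND gate used to build NOT, AND, OR, and XOR; those gates wired into a one-bit full adder; and twelve full adders chained into a 12-bit ripple-carry adder like one a 12-bit datapath might use.

```python
def nand(a, b):
    """Universal NAND gate; every gate below is built from it."""
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """One-bit full adder: two XORs, two ANDs, and an OR."""
    partial = xor_(a, b)
    total = xor_(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

def ripple_add(a, b, bits=12):
    """Chain full adders into a 12-bit ripple-carry adder.
    The result wraps modulo 2**12, just as the hardware would."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= s << i
    return result & ((1 << bits) - 1)
```

The same compose-upward pattern continues in the design below: adders and multiplexers form the ALU, and the ALU, registers, and control unit are connected by the datapath.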
Although its use is slowly fading due to society’s increased reliance upon computers, the ballpoint pen is still used on a daily basis by most people in the United States. What is now an inescapable piece of disposable technology began its life as nothing more than an expensive and seemingly short-lived fad. Popular media accounts from the mid-1940s track the ballpoint pen’s rapid initial increase in popularity followed by its similarly precipitous drop. After this initial popularity spike the media chronicled the ballpoint pen’s gradual rise from novelty to ubiquity.
On a trip to Argentina in the summer of 1945 a businessman from Chicago named Milton Reynolds discovered a fascinating pen that he was certain could be a commercial success in the United States. Reynolds brought some of the pens back to the United States with him, and within a few months they were being mass produced by his newly formed company.