Date of Submission

Fall 11-27-2018

Degree Type


Degree Name

Master of Science in Computer Science (MSCS)


Department

Computer Science

Committee Chair/First Advisor

Dr. Selena He


Big Data


Committee Member

Dr. Dan Lo

Committee Member

Dr. Ying Xie

Committee Member

Dr. Sumit Chakravarty


Abstract

The evolution of machine learning and computer vision has driven improvements and innovation across several domains. We see it applied to credit decisions, insurance quotes, malware detection, fraud detection, email composition, and any other area with enough data for a machine to learn patterns. Over the years, the number of sensors, cameras, and other cognitive equipment placed in the wilderness has grown exponentially. However, the human resources available to turn these data into something meaningful have not kept pace. For instance, a team of volunteer scientists took 8.4 years, 17,000 hours at a rate of 40 hours/week, to label 3.2 million images from Serengeti National Park. In this research, we focus on wildlife data and further demonstrate that deep learning can perform this task better and faster than the equivalent human labor. This is also an opportunity to present custom Capsule Network architectures to the deep learning community while solving the above-mentioned critical problem. Incidentally, we take advantage of these data to conduct a comparative study of multiple deep learning models, specifically VGGNet, ResNet, and a custom-made Convolutional-Capsule Network. We benchmark our work against the Serengeti project, where Mohammed Sadegh et al. recently published a 92% top-1 accuracy [23] and Gomez et al. reported a 58% top-1 accuracy [12]. We successfully reached 96.4% top-1 accuracy on the same identification task. Concurrently, we reached up to 79.48% top-1 testing accuracy on a large, complex dataset using a capsule network, outperforming the best published Capsule Network result on a complex dataset, 71% testing accuracy from Edgar Xi et al. [8, 33, 27].
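For readers unfamiliar with the metric cited throughout, top-1 accuracy is simply the fraction of samples whose single highest-scoring predicted class matches the true label. The following is a minimal, self-contained sketch (the function name and toy data are illustrative, not taken from the thesis):

```python
import numpy as np

def top1_accuracy(scores, labels):
    """Fraction of samples whose highest-scoring class equals the true label.

    scores: (n_samples, n_classes) array of model outputs (logits or probabilities)
    labels: (n_samples,) array of integer class labels
    """
    preds = np.argmax(scores, axis=1)  # highest-scoring class per sample
    return float(np.mean(preds == labels))

# Toy example: 4 samples over 3 classes.
scores = np.array([[0.1, 0.7, 0.2],
                   [0.8, 0.1, 0.1],
                   [0.3, 0.3, 0.4],
                   [0.2, 0.5, 0.3]])
labels = np.array([1, 0, 2, 0])
print(top1_accuracy(scores, labels))  # 3 of 4 predictions correct -> 0.75
```

A top-5 variant (common on large class sets) would instead check whether the true label appears among the five highest-scoring classes.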