During my MA in Social Research at the University of Leeds, I was assigned an independent project as part of the Programming for Spatial Analysts: Advanced Skills module. As the assessment was an open project, I decided to work on a topic I am currently interested in: Tweet scraping using Python.
This project contains two parts:
Part 1: Building a script to stream live Tweets on a topic and store the data in a MongoDB collection, then converting the data into a CSV file for analysis.
Part 2: Building a set of tools containing scripts to analyse the Twitter data. The scripts allow the following:
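The MongoDB-to-CSV conversion step described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the document structure and field names (`text`, `user`, `created_at`) are assumptions about how the stored Tweets might look.

```python
import csv
import io

# Hypothetical tweet documents, shaped like those a MongoDB
# collection's find() call might return (field names are assumptions).
tweet_docs = [
    {"_id": "1", "text": "Hello Leeds", "user": "alice", "created_at": "2019-01-01"},
    {"_id": "2", "text": "Raining again", "user": "bob", "created_at": "2019-01-02"},
]

def docs_to_csv(docs, fieldnames=("text", "user", "created_at")):
    """Flatten a list of tweet documents into CSV text,
    keeping only the requested fields."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(fieldnames),
                            extrasaction="ignore")  # drop fields like _id
    writer.writeheader()
    for doc in docs:
        writer.writerow(doc)
    return buffer.getvalue()

csv_text = docs_to_csv(tweet_docs)
print(csv_text)
```

In a real script the `tweet_docs` list would come from a `pymongo` query, and the CSV text would be written to a file for analysis.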
For the first assessment of the Advanced Programming module, practicals were provided for data processing and data analysis using Python and ArcGIS. We were taught the basics of ArcPy, databases, XML, NumPy, Pandas, Bokeh, and NLTK.
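As an illustration of the kind of text analysis such libraries support, here is a minimal word-frequency sketch. It uses only the standard library (a simplified stand-in for an NLTK tokenise-and-count pipeline), and the tweet texts are invented examples, not data from the assessment.

```python
import re
from collections import Counter

# Hypothetical tweet texts (illustrative only).
tweets = [
    "Loving the sunshine in Leeds today",
    "Leeds traffic is terrible today",
    "Sunshine makes everything better",
]

def word_frequencies(texts):
    """Lower-case each text, tokenise on letter runs, and count
    occurrences across the whole collection."""
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

freqs = word_frequencies(tweets)
print(freqs.most_common(3))
```

An NLTK version would swap the regular expression for `nltk.word_tokenize` and typically filter stopwords before counting.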
To access my output from the practicals, along with more information, follow this link to the corresponding GitHub repository.
Guide to the GitHub Repository Structure
This is the second assessed project for my Python coding module. The agent-based model was coded without assistance or practicals, to demonstrate my coding ability. The model's environment is a city in which drunk people attempt to travel home.
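The core mechanic of such a model can be sketched as a biased random walk: an agent staggers across a grid, sometimes moving randomly and sometimes moving towards home. This is a simplified illustration under assumed parameters (grid size, pub and home coordinates, a `drunkenness` probability), not the project's actual code.

```python
import random

GRID_SIZE = 50          # assumed city grid dimensions
PUB = (25, 25)          # assumed starting point
HOME = (5, 40)          # assumed destination

def step_towards(pos, target, drunkenness=0.5, rng=random):
    """Move one cell: with probability `drunkenness` move randomly,
    otherwise move one cell closer to the target on each axis."""
    x, y = pos
    if rng.random() < drunkenness:
        x += rng.choice((-1, 0, 1))
        y += rng.choice((-1, 0, 1))
    else:
        x += (target[0] > x) - (target[0] < x)
        y += (target[1] > y) - (target[1] < y)
    # Clamp to the grid so the agent cannot wander off the map.
    return (max(0, min(GRID_SIZE - 1, x)),
            max(0, min(GRID_SIZE - 1, y)))

def walk_home(start, home, max_steps=10_000, seed=42):
    """Step the agent until it reaches home or runs out of steps."""
    rng = random.Random(seed)
    pos, steps = start, 0
    while pos != home and steps < max_steps:
        pos = step_towards(pos, home, rng=rng)
        steps += 1
    return pos, steps

final_pos, steps_taken = walk_home(PUB, HOME)
print(final_pos, steps_taken)
```

A fuller model would run many agents at once, track the cells each one passes through to build a density surface, and vary drunkenness per agent.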
Key Model Notes
Here you will find the link to my project work for an agent-based model built following the practical sessions for the assignment named above. The module is part of my MA Social Research (Interdisciplinary) programme, which taught me the Python language, a skill I will utilise during my PhD at the University of Leeds.
Key Model Notes