adding pages

This commit is contained in:
andy 2022-08-29 17:18:21 +01:00
parent 7a9d86a63e
commit 30bb38fb69
15 changed files with 259 additions and 7 deletions

assets/images/andy.png Normal file

@@ -1,8 +1,10 @@
baseURL = 'https://sarsoo.xyz/'
languageCode = 'en-us'
baseURL = 'https://new.sarsoo.xyz/'
languageCode = 'en-gb'
title = 'sarsoo.xyz'
theme = 'hugo-coder'
paginate = 40
[params]
author = "sarsoo"
info = "dev & engineering"
@@ -39,11 +41,21 @@ theme = 'hugo-coder'
url = "https://www.linkedin.com/in/andypack/"
[[menu.main]]
name = "Posts"
name = "Mixonomer"
weight = 1
url = "mixonomer/"
[[menu.main]]
name = "Dev & Engineering"
weight = 2
url = "dev-engineering/"
[[menu.main]]
name = "Posts"
weight = 3
url = "posts/"
[[menu.main]]
name = "Contact"
weight = 2
weight = 4
url = "contact/"


@@ -1,10 +1,10 @@
---
title: "Contact"
date: 2022-08-21T22:12:40+01:00
date: 2020-12-25T00:04:40+00:00
---
UK-based, award-winning post-grad electronic engineering student & previous Disney intern
UK-based, award-winning post-grad [electronic engineering](/dev-engineering) student & previous Disney intern
Multilingual programmer working from [embedded systems](/posts/iot) to holoportation and full-stack web-dev
Multilingual programmer working from [embedded systems](/posts/iot) to [holoportation](/holo) and [full-stack web-dev](/mixonomer)
I draw sometimes too


@@ -0,0 +1,107 @@
---
title: "Dev & Engineering"
date: 2021-01-17T22:59:40+00:00
draft: false
---
# [Holoportation](/holo)
`C++ [Kinect SDK, OpenCV]`
`C# [Winforms, Unity 3D]`
Holoportation is an area of research exploring the ability to livestream 3D environments over the internet. The technology has many applications for __AR/VR__ experiences like 3D sports and music events or smaller-scale applications like a 3D __Twitch__.
The holograms are captured in the form of a __point cloud__, a cluster of coloured dots that, when presented correctly, can appear like the original object.
My undergraduate dissertation documented extending the [__LiveScan3D__](https://github.com/MarekKowalski/LiveScan3D) holoportation platform to allow multi-source streaming. The existing capabilities allowed a single scene to be captured in real-time and streamed elsewhere for viewing; multi-source allows multiple independent scenes to be received and composited at the same time (a many-to-one system).
[Read More](/holo)
# [Mixonomer](/mixonomer)
`Python [Flask]`
`JavaScript [React]`
Mixonomer is a web app for creating smart playlists for __Spotify__. These playlists watch other playlists to use them as sources of tracks. These tracks are filtered and sorted before updating a __Spotify__ playlist.
Updates are run multiple times a day or on-demand. Additionally, __Last.fm__ integration provides listening statistics for your playlists.
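The update cycle described above can be sketched roughly like this. It is a minimal stand-in: `Track` and `update_playlist` are illustrative names, not Mixonomer's actual API, and the filter/sort policy shown is just one plausible configuration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Track:
    name: str
    artist: str
    release_date: str  # ISO date, e.g. "2021-06-04"

def update_playlist(sources, banned_artists=()):
    """Merge watched source playlists, drop banned artists and
    duplicates, then sort newest-release first."""
    seen, merged = set(), []
    for playlist in sources:
        for track in playlist:
            key = (track.name.lower(), track.artist.lower())
            if key in seen or track.artist in banned_artists:
                continue  # filtered out
            seen.add(key)
            merged.append(track)
    # reverse release-date order keeps new music at the top
    return sorted(merged, key=lambda t: t.release_date, reverse=True)
```

The result would then be written back to a Spotify playlist via the Web API on each scheduled or on-demand run.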
![cloud structure](/mixonomer/cloud-structure-3.png)
The project began as an exercise in recreating the functionality of [__Paul Lamere's__](https://twitter.com/plamere) [__Smarter Playlists__](http://playlistmachinery.com/) app. This tool had become a really important part of my daily listening habits as a way of combining my smaller sub-genre playlists into larger mixes.
The app has a __Python__ back-end written in __Flask__. The front-end was built using a __Node + Webpack + React__ stack.
The system is now deployed with a fully serverless architecture.
[Read More](/mixonomer)
[Try It Out](https://mixonomer.sarsoo.xyz/)
[Source Code](https://github.com/Sarsoo/Mixonomer)
# [Selector](/selector)
`.NET [ASP.NET, Redis, Docker]`
`TypeScript [Vue]`
A __Spotify__ listening agent which watches what you listen to and presents related data and information in a live dashboard. __Spotify__ exposes some interesting track data that isn't visible in the official clients, such as beats-per-minute, key signature and a musical descriptor.
![dashboard](/selector/dashboard.png)
[Read More](/selector)
[Try It Out](https://selector.sarsoo.xyz/)
[Source Code](https://github.com/Sarsoo/Selector)
# Listening Engineering
`Python [scikit-learn, Jupyter]`
__Spotify__ and __Last.fm__ are two powerful platforms for music data and consumption.
I wanted to explore what insights could be found in my 3 years of __Last.fm__ scrobbles when augmented with __Spotify__ data. Ideally, I also wanted to be able to apply the intelligence to the __Mixonomer__ playlist pipeline.
__Spotify__ provides audio features for the tracks on its platform. These features describe a number of qualities for the tracks including how much energy it has and how vocal it is. I investigated whether the set of audio features for my larger genre playlists could be used to classify tracks by genre.
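The classification idea above can be illustrated with a sketch. The actual investigation used __scikit-learn__; this stdlib nearest-centroid stand-in, its two features and all its numbers are invented purely to show the shape of the problem.

```python
import math

def centroid(vectors):
    # mean of each feature column
    return tuple(sum(col) / len(vectors) for col in zip(*vectors))

def classify(track, centroids):
    """Assign a track's feature vector to the closest genre centroid."""
    return min(centroids, key=lambda genre: math.dist(track, centroids[genre]))

# Hypothetical feature vectors: (energy, speechiness)
training = {
    "hip-hop": [(0.7, 0.30), (0.8, 0.25)],
    "ambient": [(0.2, 0.03), (0.1, 0.05)],
}
centroids = {genre: centroid(vs) for genre, vs in training.items()}
```

An unseen track's audio features can then be compared against each genre playlist's centroid to guess where it belongs.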
[Read More]()
# Signal Processing
Throughout my studies, I found myself particularly interested in the signal processing and AI modules; these have included:
- Computer Vision and Pattern Recognition
- Visual Search Report
- Robotics
- ROS Labs
- Speech & Audio Processing & Recognition
- Linear Predictive Speech Synthesiser
- Hidden Markov Model Training
- Image Processing & Deep Learning
- CNN Training Coursework
- AI & AI Programming
- Shallow MLP Coursework
[Posts](/posts)
[Coursework Code](https://github.com/Sarsoo?tab=repositories&q=coursework)
I've been coding for 7 years and I now work as a software engineer in fintech. Day-to-day this is in [__C#__](/holo/) and [__TypeScript__](/mixonomer) but I also like working with [__Python__](/mixonomer) and [__Rust__](https://github.com/Sarsoo?tab=repositories&q=&type=&language=rust&sort=). I keep all of my projects on [__GitHub__](http://github.com/sarsoo).
Alongside development, I also enjoy working on infrastructure; I have 5 years' experience using __Linux__ and managing networks. I have experience working with cloud technologies from [__virtual machines__](/holo), [__web server PaaS__](/mixonomer) and [__serverless functions__](/mixonomer) to [__NoSQL__](/mixonomer), Big Data SQL and [__pub/sub messaging__](/mixonomer). Much of this experience was gained during my [__Mixonomer__](/mixonomer) project and my __Disney__ internship. As part of my [dissertation](/holo#research), I used a global cluster of virtual machines as an environment to measure and experiment with holographic video QoS over long distances.
At university, I was particularly interested in the software side of the field, including modules in __programming__, __signal processing__ and __AI__. I also took a set of modules in __semiconductors__ and __nanoscience__.
[Posts](/posts)
# Awards
Throughout my time at university, I earned multiple awards for academic achievement:
- ___Dean's List for Academic Achievement___ (2018)
- Awarded for overall academic performance as part of my international exchange program with the [__California State University, Los Angeles__](https://www.calstatela.edu/)
- ___Lumentum Award___ (2020)
- Awarded for achieving the __highest mark__ in my third year __Semiconductors & Optoelectronics__ module
- ___Atkins Best Oral Presentation 2nd Prize___ (2021)
- Awarded for giving the second-best oral presentation for a ___multi-disciplinary design project___. The project involved designing a fully renewable ship and depot to repair sub-sea fibre-optic cables.

content/holo/index.md Normal file

@@ -0,0 +1,48 @@
---
title: "Holoportation with LiveScan3D"
date: 2021-01-19T21:49:40+00:00
draft: false
---
[LiveScan3D](https://github.com/MarekKowalski/LiveScan3D) is a holographic teleportation or _holoportation_ platform. The app has a client-server model for streaming 3D or _volumetric_ video over the internet. It was written in 2015 by a pair of academics at the Warsaw University of Technology, [Marek Kowalski](http://home.elka.pw.edu.pl/~mkowals6/) and [Jacek Naruniec](http://home.elka.pw.edu.pl/~jnarunie/).
`Kowalski, M.; Naruniec, J.; Daniluk, M.: "LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors". in 3D Vision (3DV), 2015 International Conference on, Lyon, France, 2015`
The app works by capturing what is called a [_point cloud_](https://en.wikipedia.org/wiki/Point_cloud), a cluster of coloured points. These points act as the pixels of the image with an extra third coordinate for depth. The coordinates of these points are sent with their RGB values; a good enough resolution allows rendering these points to create a decent real-time image for AR/VR streaming. The original version of the software used the Xbox Kinect camera for the Xbox One but it also supports the new Azure Kinect.
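A point as described can be pictured with this minimal sketch: three spatial coordinates plus an RGB colour. The packed wire format here is an assumption for illustration, not LiveScan3D's actual protocol.

```python
import struct
from dataclasses import dataclass

@dataclass
class Point:
    # spatial coordinates, with z as depth
    x: float
    y: float
    z: float
    # colour channels, 0-255
    r: int
    g: int
    b: int

    def pack(self) -> bytes:
        """Serialise as 3 little-endian floats plus 3 bytes (15 bytes/point)."""
        return struct.pack("<fffBBB", self.x, self.y, self.z,
                           self.r, self.g, self.b)
```

A frame is then just a large array of such points, so per-point size directly drives the stream's bandwidth requirements.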
## On This Page
1. Multi-Source
2. Mobile AR
3. Research
4. Photoshoot
# Multi-Source
For my undergraduate dissertation, I was tasked with extending the original software to allow _multi-source_ streaming. The existing system could stream one scene to one server for viewing. This scene being captured, whether by one camera or from multiple angles, is called a _source_. Multi-source operation allows more than one scene to be composited and displayed at the server or a connected AR client.
The extension works by tagging each frame of footage with an ID indicating which source it represents.
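The many-to-one idea can be sketched as follows; the names here are illustrative, not the dissertation's actual code. The server keeps the latest frame per source ID and merges them into one scene.

```python
class Compositor:
    """Server-side sketch: composite the latest frame from each source."""

    def __init__(self):
        self.latest = {}  # source_id -> most recent frame (a list of points)

    def receive(self, source_id, frame):
        # newer frames from a source replace older ones
        self.latest[source_id] = frame

    def composite(self):
        """Merge the most recent frame from every source into one scene."""
        scene = []
        for frame in self.latest.values():
            scene.extend(frame)
        return scene
```

Because frames carry their source ID, independent scenes stay separable and can be placed or transformed individually before compositing.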
# Mobile AR
The main use for a platform like LiveScan3D is augmented reality using a mobile phone or Microsoft HoloLens. Although the point clouds are suitable for rendering in both AR and VR environments, the resolution and nature of the captured perspective are well suited to recreation in AR.
A client AR app was written in Unity 3D by the original authors; initially designed for Microsoft's HoloLens headset, it was modified to target Android devices using Google's ARCore library. From here, I upgraded it to use the Unity-native ARFoundation library. ARFoundation abstracts AR functionality away from the device-specific libraries, including Apple's ARKit and Google's ARCore. Instead, an AR environment can be constructed wholly from Unity components, which are replaced by these libraries at compile-time. This was partly a result of the pandemic stopping access to the lab; without the ability to debug on Android devices, I hoped to deploy the app on my own phone.
Despite successfully migrating the app to use the ARFoundation library, it is still not working on iOS. This is because the existing graphics pipeline doesn't play well with the iOS Metal graphics API. The app renders a point cloud by spawning objects to act as the points of the frame. Each point is rendered using a geometry shader that creates a small coloured square billboard facing the camera. This was a pretty efficient way to render the hologram on a mobile device. The problem is that the Metal graphics API used on iOS doesn't support geometry shaders.
As a result, the app works fine from a network perspective, but there is a purple rendering error instead of a hologram.
# Research
As part of my ongoing work with the holoportation research group, I have also conducted experiments into the suite's network behaviour. The original software was well suited to the lab environment it was written for, but a number of limitations affect its performance over the open internet. For one, it uses TCP for its data transmission; streaming protocols usually avoid TCP because of the overhead it incurs, among other reasons.
The work that I did used a collection of virtual machines hosted in various global locations as an environment to measure quality-of-service stats. Streams were set up over varying distances to see how distance affected values like latency and throughput. This led to a channel management system being written that would manually control the throughput of frames in order to prioritise certain operating parameters. The system proved effective and further expansions are being investigated.
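As a hedged sketch of what controlling frame throughput could look like (the real system's policy and interfaces aren't described in this post): a simple gate that admits frames up to a target rate and drops the rest, trading throughput for latency.

```python
class FrameGate:
    """Drop frames above a target rate so latency is prioritised
    over raw throughput. Illustrative only."""

    def __init__(self, max_fps, clock):
        self.interval = 1.0 / max_fps
        self.clock = clock        # injected time source, for testability
        self.next_allowed = 0.0

    def admit(self) -> bool:
        """Return True if a frame may be sent now, else drop it."""
        now = self.clock()
        if now >= self.next_allowed:
            self.next_allowed = now + self.interval
            return True
        return False
```

In a real channel manager the cap would presumably be adjusted dynamically from measured QoS stats rather than fixed up front.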
# Photoshoot
The system uses a [_point cloud_](https://en.wikipedia.org/wiki/Point_cloud) to capture and transmit 3D video. When zoomed in with a small point size, the medium looked really cool with the black virtual background, see here for more.


@@ -0,0 +1,54 @@
---
title: "Mixonomer: Smart Spotify Playlists"
date: 2021-01-19T14:23:40+00:00
draft: false
---
![ci badge](https://github.com/sarsoo/mixonomer/workflows/test%20and%20deploy/badge.svg)
Mixonomer is a web app to augment your __Spotify__ listening experience with _smart playlists_. A _smart playlist_ watches several child playlists and filters the tracks to create new mixes. Updates are run multiple times a day or on-demand.
[Try It Out](https://mixonomer.sarsoo.xyz/)
[Read the Docs](https://docs.mixonomer.sarsoo.xyz/)
![playlist list](Playlists.png)
Enable recommendations to include additional __Spotify__ suggestions based on a playlist's tracklist. Reference your playlists and those you follow by name, or add references to other _smart playlists_ to additionally include their watchlists.
Select library tracks for the playlists to include your __Spotify__ saved tracks in the mix.
![playlist example](PlaylistExample.png)
You can shuffle playlists for output or sort by reverse release date for a thumbnail that stays fresh with new music artwork.
# Tags
Tags are a listening statistics visualiser for exploring your [__Last.fm__](https://last.fm) habits. __Last.fm__ is great for exploring your listening history, but sometimes I've wanted to be able to group some artists or albums to see their data in one place.
![tag example](TagExample.png)
Mixonomer's tags let you do this; I use them for things like grouping a label's artists together to see how many times I've listened to _Dreamville_ artists, for example. Tick time to estimate the amount of time you've spent listening to each.
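A minimal sketch of the grouping idea behind tags; the 4-minutes-per-track average used for the time estimate is an assumed figure for illustration, not Mixonomer's actual value.

```python
AVG_TRACK_MINUTES = 4  # assumed average track length for the estimate

def tag_stats(tag_artists, scrobbles):
    """tag_artists maps tag -> list of artists; scrobbles maps
    artist -> play count. Returns per-tag totals."""
    stats = {}
    for tag, artists in tag_artists.items():
        plays = sum(scrobbles.get(artist, 0) for artist in artists)
        stats[tag] = {"plays": plays,
                      "est_minutes": plays * AVG_TRACK_MINUTES}
    return stats
```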
# Development
I started this project as an exercise in recreating the functionality of [__Paul Lamere's__](https://twitter.com/plamere) [__Smarter Playlists__](http://playlistmachinery.com/) app. The tool had become a really important part of my daily listening habits as a way of combining my smaller sub-genre playlists into larger mixes.
I wanted to see what an app like this looks like, what it involves to build it. At this point, I had neither built a web server nor written a significant front-end with a proper framework.
In the process of working on this project, I learnt how to create web servers with __Python's__ [__Flask__](https://flask.palletsprojects.com/en/1.1.x/) module, how to deploy them to a cloud environment and how to interact with other cloud-based services. The architecture is now completely serverless using __Google's App Engine__, __Cloud Functions__ and __Firestore__ services.
![cloud structure](cloud-structure-3.png)
###### Cloud architecture of services in Google's Cloud Platform
The front-end was written in __React__, which I also learnt in the process. It was, in fact, my first significant modern __JavaScript__ project utilising a __Node__ + __Webpack__ stack; it was interesting getting to grips with the __JS__ ecosystem by making them work together and getting the result to deliver correctly from the back-end.
The app sits in the background now; it has replaced [__Smarter Playlists__](http://playlistmachinery.com/) for my _smart playlists_ and it's been a good testbed for playing with more cloud services.
[Try It Out](https://mixonomer.sarsoo.xyz/)
[Github](https://github.com/Sarsoo/Mixonomer)
[iOS Github](https://github.com/Sarsoo/Mixonomer-iOS)
[C# Github](https://github.com/Sarsoo/Mixonomer.NET)


@@ -0,0 +1,6 @@
---
title: "Listening Engineering"
date: 2021-02-20T12:22:40+00:00
draft: false
---

content/selector/index.md Normal file

@@ -0,0 +1,25 @@
---
title: "Selector: A Spotify listening agent"
date: 2022-04-04T21:26:40+00:00
draft: false
---
![ci](https://github.com/sarsoo/Selector/actions/workflows/ci.yml/badge.svg)
I've been working on my .NET skills recently and, as I tend to do, practiced with a tool that would integrate with my music listening habits.
Selector is an agent that watches what you're listening to on Spotify and reacts to changes by firing pluggable events. These include retrieving the current song's characteristics and its play count on Last.fm. This information is displayed on a dashboard that updates live.
![dashboard example](dashboard.png)
###### The dashboard shows information from Spotify and Last.fm
The app consists of an ASP.NET web app and a command-line service that runs the listening agents. A Redis instance is used for caching and messaging between the nodes.
# Last.fm
[Last.fm](https://last.fm) is a service that records what you listen to in order to give stats and recommendations; Spotify can be linked to record your listening history. Selector includes an integration to present play-count data for the current track, along with its album and artist, on the dashboard.
Along with this, a background agent can be configured in the web UI's settings to mirror your listening history to the local database, allowing quicker, live statistics querying without constant network calls. This really adds to the data that can be presented live. Last.fm has a decent API, but querying a user's entire history takes a while and seriously reduces the depth of insights that can be presented in real-time, especially as this dataset grows with time. By storing a copy locally, the whole history can be queried quickly.
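The mirroring idea can be sketched with a local store. SQLite is used here purely for illustration (the post doesn't specify Selector's database, and the real app is .NET): scrobbles pulled from Last.fm once are kept locally so play counts can be queried instantly instead of paging the remote API.

```python
import sqlite3

def make_store():
    """Create an in-memory scrobble mirror."""
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE scrobbles (artist TEXT, track TEXT, played_at TEXT)")
    return db

def mirror(db, scrobbles):
    """Bulk-insert (artist, track, played_at) rows fetched from the API."""
    db.executemany("INSERT INTO scrobbles VALUES (?, ?, ?)", scrobbles)

def play_count(db, artist, track):
    """Instant local query, no network round-trips."""
    (n,) = db.execute(
        "SELECT COUNT(*) FROM scrobbles WHERE artist = ? AND track = ?",
        (artist, track)).fetchone()
    return n
```

Once mirrored, arbitrary aggregations over the full history become cheap enough to run on every dashboard refresh.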
Get deeper insights into what you listen to, [start here](https://selector.sarsoo.xyz).