diff --git a/content/dev-engineering/index.md b/content/dev-engineering/index.md
index 20bf959..59594d0 100644
--- a/content/dev-engineering/index.md
+++ b/content/dev-engineering/index.md
@@ -13,8 +13,12 @@ Holoportation is an area of research exploring the ability to livestream 3D envi
 
 The holograms are captured in the form of a __point cloud__, a cluster of coloured dots that, when presented correctly, can appear like the original object.
 
+{{< figure src="/images/holo-avatar.jpg" >}}
+
 My undergraduate dissertation documented extending the [__LiveScan3D__](https://github.com/MarekKowalski/LiveScan3D) holoportation platform to allow multi-source streaming. The existing capabilities allowed a single scene to be captured in real-time and streamed elsewhere for viewing, multi-source allows multiple independent scenes to be received and composited at the same time (a many-to-one system).
 
+{{< youtube NP0aVjuk5fU >}}
+
 [Read More](/holo)
 
 # [Mixonomer](/mixonomer)
diff --git a/content/holo/ServerWindow.png b/content/holo/ServerWindow.png
new file mode 100644
index 0000000..5db15dd
Binary files /dev/null and b/content/holo/ServerWindow.png differ
diff --git a/content/holo/Structure.png b/content/holo/Structure.png
new file mode 100644
index 0000000..6d344ab
Binary files /dev/null and b/content/holo/Structure.png differ
diff --git a/content/holo/ballcap.jpg b/content/holo/ballcap.jpg
new file mode 100644
index 0000000..a3bc354
Binary files /dev/null and b/content/holo/ballcap.jpg differ
diff --git a/content/holo/index.md b/content/holo/index.md
index 70af1b2..f0c6654 100644
--- a/content/holo/index.md
+++ b/content/holo/index.md
@@ -21,12 +21,20 @@ The app works by capturing what is called a [_point cloud_](https://en.wikipedia
 3 Research
 4 Photoshoot
 
+{{< figure src="Structure.png" caption="Client-server structure of the environment" alt="client-server structure" >}}
+
 # Multi-Source
 
+![pair of subjects facing each other](pair.jpg)
+
 My undergraduate dissertation was tasked with extending the original software to allow _multi-source_ streaming. The current system could stream one scene to one server for viewing. This scene being captured, whether by one camera or from multiple angles, is called a _source_. Multi-source operation allows more than one scene to be composited and displayed at the server or a connected AR client.
 
 The development works by including an ID to indicate what source a frame of footage represents.
 
+{{< youtube NP0aVjuk5fU >}}
+
+###### A couple of recorded sources operating in the virtual space. A third live one is connected part way through
+
 # Mobile AR
 
 The main use for a platform like LiveScan3D is augmented reality using a mobile phone or Microsoft Hololens. Although the point clouds are suitable for rendering in both an AR and VR environment, the resolution and nature of the captured perspective is suited well to recreation in AR.
@@ -37,12 +45,18 @@ Despite successfully migrating the app to use the ARFoundation library, the app
 
 As a result, the app works fine from a network perspective, but there is a purple rendering error instead of a hologram.
+{{< figure src="mobile-holo.png" caption="AR app working on iOS, a purple rendering error where the hologram should be" alt="mobile screenshot" >}} + # Research +{{< figure src="ServerWindow.png" caption="Server window with additional statistics including bandwidth and latency as exponential moving average" alt="server window" >}} + As part of my ongoing work with the holoportation research group, I have also conducted experiments into the suite’s network behaviour. The original software was suited well to the lab environment that it was written for but there are a number of limitations that affects its performance over the open internet. For one, it uses TCP for its data transmission; streaming protocols usually don’t use this because of the overhead it incurs among other reasons. The work that I did used a collection of virtual machines hosted in various global locations as an environment to measure quality-of-service stats. Streams were set up over varying distances to see how it affected values like latency and throughput. This led to a channel management system being written that would manually control the throughput of frames in order to prioritise certain operating parameters. The system proved effective and further expansions are being looked into. # Photoshoot +![ballcap](ballcap.jpg) + The system uses a [_point cloud_](https://en.wikipedia.org/wiki/Point_cloud) to capture and transmit 3D video. When zoomed in with a small point size, the medium looked really cool with the black virtual background, see here for more. \ No newline at end of file diff --git a/content/holo/pair.jpg b/content/holo/pair.jpg new file mode 100644 index 0000000..1a7039f Binary files /dev/null and b/content/holo/pair.jpg differ diff --git a/content/mixonomer/index.md b/content/mixonomer/index.md index c85216a..2675bac 100644 --- a/content/mixonomer/index.md +++ b/content/mixonomer/index.md @@ -38,8 +38,7 @@ I wanted to see what an app like this looks like, what it involves to build it. In the process of working on this project, I learnt how to create web servers with __Python’s__ [__Flask__](https://flask.palletsprojects.com/en/1.1.x/) module, how to deploy them to a cloud environment and how to interact with other cloud-based services. The architecture is now completely serverless using __Google‘s App Engine__, __Cloud Functions__ and __Firestore__ services. -![cloud structure](cloud-structure-3.png) -###### Cloud architecture of services in Google’s Cloud Platform +{{< figure src="cloud-structure-3.png" caption="Cloud architecture of services in Google’s Cloud Platform" alt="cloud structure" >}} The front-end was written in __React__, which I also learnt in the process. It was, in fact, my first significant modern __Javascript__ project utilising a __Node__ + __Webpack__ stack, it was interesting getting to grips with the __Js__ ecosystem by making them work together and getting the result to deliver correctly from the backend. diff --git a/content/selector/index.md b/content/selector/index.md index 956dcaf..c02e02b 100644 --- a/content/selector/index.md +++ b/content/selector/index.md @@ -10,9 +10,7 @@ I've been working on my .NET skills recently and, as I tend to, practiced with a Selector is an agent that watches what you’re listening to on Spotify and reacts to changes by firing pluggable events. These include retrieving the current song’s characteristics and play count on Last.fm. This information is displayed on a dashboard that updates live. 
 
-![dashboard example](dashboard.png)
-
-###### The dashboard shows information from Spotify and Last.fm
+{{< figure src="dashboard.png" caption="The dashboard shows information from Spotify and Last.fm" alt="dashboard example" >}}
 
 The app consists of a ASP.NET web app and a command line service that runs the listening agents. A Redis instance is used for cache and messaging between the nodes.
 
diff --git a/layouts/partials/footer.html b/layouts/partials/footer.html
index b5ed9e5..e264bfa 100644
--- a/layouts/partials/footer.html
+++ b/layouts/partials/footer.html
@@ -1,17 +1,23 @@
diff --git a/assets/images/andy.png b/static/images/andy.png
similarity index 100%
rename from assets/images/andy.png
rename to static/images/andy.png
diff --git a/assets/images/holo-avatar.jpg b/static/images/holo-avatar.jpg
similarity index 100%
rename from assets/images/holo-avatar.jpg
rename to static/images/holo-avatar.jpg