Hi, I'm Isak!
I'm a software developer living in northern Sweden, where I spend my time building clever things and cool experiences.
I'm currently located in Luleå, Sweden, where I studied Software Engineering at Luleå University of Technology.
During my time as a student I participated in various programming-related events and took on a couple of freelance jobs, mainly focused on English-Swedish translation and small-scale programming projects.
In my free time, I enjoy reading, gaming and occasionally some tennis. One of my favourite things to do in my spare time is to attempt to automate the boring parts of life!
I'm fluent in both English and Swedish.
TiaBot is a Discord chatbot, written in Java and Python.
Functionality includes accessing various third-party APIs to present information in response to queries, performing small-scale statistical analysis, tracking and managing information for a community voting system, and several meme-related responses of varying complexity.
Feature | Description |
---|---|
Echo | The bot can echo messages sent in a private message to a set server. Users can also set it up to respond with custom phrases to their messages. |
Soundboard | Plays back various sound files through voice chat, in response to Discord messages or network traffic to my computer. |
Game announcing | Verbally announces various game events to help players track and react to in-game events. |
Text-to-Speech | Lets users use the bot as an interface for playing back TTS audio through voice chat. It's mainly used by players who are a bit shy about using a microphone, or as a joke by having it attempt to pronounce odd words in unusual accents. |
Cleverbot integration | Lets users have group conversations with Cleverbot. Usually used during downtime in-between games. |
Voting system | Uses Discord's emoji reactions to let users quickly input reactions as votes. |
XIVDB Search | Accesses XIVDB's API to retrieve item and crafting information. |
DPS Parses | If my computer is on and I'm connected to the game, users can request that the bot post a graph of the group's damage per second. |
Stat parsing | Converts character stats into human-understandable percentages and multipliers (e.g. 1000 direct hit = 20% chance to hit, and a 1.45x damage multiplier on hit). |
Character retrieval | The bot can retrieve and post an image of what someone's in-game character looks like. |
Numberwang | The bot contains a playable version of the fictional game "Numberwang", scoring and tracking all users on each server (scores are printed on request). |
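The stat parsing feature boils down to mapping raw stat points onto human-readable percentages and multipliers. A minimal Python sketch of the idea, assuming a simple linear mapping; the real in-game formula differs, and the coefficients here are illustrative guesses anchored to the example above:

```python
# Hypothetical linear conversion from a raw "direct hit" stat to a
# readable description. Coefficients are illustrative only: chosen so
# that 1000 points comes out as the 20% / 1.45x example above.

DIRECT_HIT_RATE_PER_POINT = 0.0002   # assumed: 1000 points -> 20% chance
DIRECT_HIT_MULTIPLIER = 1.45         # assumed flat damage bonus on proc

def describe_direct_hit(points: int) -> str:
    chance = points * DIRECT_HIT_RATE_PER_POINT
    # Expected damage gain = chance of proc times the extra damage on proc.
    expected = 1 + chance * (DIRECT_HIT_MULTIPLIER - 1)
    return (f"{points} direct hit = {chance:.0%} chance to hit, "
            f"{DIRECT_HIT_MULTIPLIER}x damage on proc "
            f"(~{expected:.3f}x expected damage)")

print(describe_direct_hit(1000))
```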
It was originally written as a TeamSpeak bot several years back, where it initially served as a sort of soundboard integrating with a game's chat system by monitoring the network traffic to my computer.
Later it expanded to also watch for and respond to various in-game events. Google's Text-to-Speech engine was used to verbally announce things such as upcoming boss attacks, player deaths and cooldown timers.
The eventual addition of a function that let users make the bot say whatever they wanted in any accent via TTS made it quite popular in the community, and I eventually moved it to a cloud-based server to maximize its uptime.
Due to various circumstances, the community eventually moved from TeamSpeak to Skype, and I rewrote the bot so it could follow. Since Skype is more text-based than TeamSpeak, I expanded the functionality to include pulling and presenting game data, integrating with Cleverbot for amusing conversations (which also works with TTS), and helping organize group sessions.
The community has since migrated to Discord, and TiaBot followed. I still maintain the bot and occasionally add new features.
At its peak, TiaBot was actively used by approximately 250 users across 5 different servers.
VR Classroom is an educational app targeted towards middle-school students.
It's intended to make use of the unique possibilities offered by virtual environments in order to better convey various educational concepts to children.
The application's content is based on the Swedish National Agency for Education's guidelines for the knowledge a child should have attained during their first few years of schooling.
In addition, we also worked closely with teachers and teaching students to formulate the level design and experiences in the application.
Some of the first feedback we received from the teachers was concern about the cost of a VR setup like the HTC Vive or the Oculus Rift, and so we decided to limit the project to Google Cardboard. This also meant that the user interactions had to remain relatively simple, but also that the graphical direction of the app had to take the resource limitations of a phone into consideration.
One of the primary goals with the application was to explore the unique possibilities offered by VR. Therefore, most of the design is centered around visual and spatial experiences.
One of our unofficial guidelines was to "Create experiences that normally wouldn't be possible in a classroom environment".
We identified three main areas that we believed would benefit the most from VR technology:
Geometry
Geography
Spatial reasoning
In addition, we decided to experiment with using VR to increase engagement with a traditional subject, in this case English.
Geometry is a subject where many students initially find it challenging to grasp how the dimensions of geometric shapes affect their volume and surface area, especially when 3D objects are depicted on a whiteboard or changes in scale are illustrated.
Given the nature of VR, we decided to focus on experiences that helped students gain an intuitive understanding of scale and 3D geometry.
Since the tasks are targeted at younger children, extra care was taken to emphasize encouraging feedback and soften the psychological impact of failure.
Scale at large values is something that children (and even adults) often find difficult to understand, and it also happens to be one of the things that reliably generates a "wow factor" in VR. We therefore decided to open the geometry section with an overview of scale as a concept.
Most children have a rough understanding of how far away their school is from their home. As such, we decided to use maps as the base for describing scale.
The user starts off standing on top of a small map while receiving a verbal lesson on how the scaling of objects works. At an appropriate point in the lesson, the map starts growing until it reaches a 1:1 "real" scale.
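The growing map can be driven by a simple scale interpolation. A Python sketch of the idea (the app itself runs in a game engine, so this is a language-agnostic illustration; the start scale and the choice of exponential interpolation are my assumptions):

```python
import math

def scale_at(t: float, start_scale: float = 0.001, end_scale: float = 1.0) -> float:
    """Interpolate from the miniature map scale to 1:1 "real" scale.

    t runs from 0 (lesson start) to 1 (full size). Interpolating the
    logarithm of the scale makes the growth feel uniform to the viewer,
    since each moment multiplies the map's size by the same factor.
    """
    t = min(max(t, 0.0), 1.0)
    log_s = math.log(start_scale) + t * (math.log(end_scale) - math.log(start_scale))
    return math.exp(log_s)

# Each frame, the map object would be rescaled with something like:
#   map_object.scale = scale_at(elapsed_time / lesson_duration)
```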
After the introduction to scale, the user is then taken through a series of randomly generated interactive tasks to complete.
The user is instructed to adjust the scale of two cubes so that they match.
This is the simplest of the tasks, where the user gets to familiarize themselves with the Google Cardboard controls and make sure they understand how scale works.
Here the user is presented with a series of differently scaled cubes and asked to identify which cube corresponds to a given scaling of another.
The difficulty of this task is adjusted based on how well the user did in previous attempts. Repeated success will result in smaller differences in scale, while failure to identify the answer means that more obvious differences will be used.
If the user fails a large number of times, they will eventually be sent back to repeat the introduction section.
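The adaptive difficulty described above can be sketched as a small function. All factors and thresholds here are illustrative guesses, not the app's actual tuning:

```python
def next_scale_gap(streak: int, failures: int,
                   base_gap: float = 0.5, min_gap: float = 0.05,
                   max_failures: int = 5):
    """Pick the scale difference between cubes for the next task.

    Repeated successes shrink the gap (cubes become harder to tell
    apart); failures widen it again so the differences are more
    obvious. Returns None when the user should be sent back to the
    introduction section. All constants are hypothetical tuning values.
    """
    if failures >= max_failures:
        return None  # too many failures: repeat the introduction
    gap = base_gap * (0.8 ** streak) * (1.25 ** failures)
    # Clamp so the task never becomes impossible or trivially easy.
    return max(min(gap, base_gap * 2), min_gap)
```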
When first introduced to 3D geometry, many students find it difficult to intuitively understand how different measurements affect the volume of an object, partially due to the 2D nature of the whiteboard. This is a task that VR is uniquely suited to tackle.
The user is presented with a transparent glass container and asked to fill it to the brim with liquid. Depending on the difficulty (based on the success/failure rate of previous attempts), they are given either the volume or the measurements of the container.
The primary goal here is to give the students an intuitive understanding of how large the objects are.
In order to make the failure of this task a bit more fun, the container will actually overflow and cover the floor with liquid if the student accidentally puts too much liquid in it.
Very young children are still learning to understand and describe the relative placement of objects, such as placing something over, under, next to, etc. We designed this portion of the application to let them practice and learn to correctly place objects in relation to each other.
The user is given randomly generated instructions on a virtual blackboard, asking them to decorate the room.
The instructions ask the user to place objects in various positions relative to each other (over, under, next to, and so on).
Once the user is finished decorating the room, they select "Done" and get visual feedback on how they did.
The goal in this section is to attempt to leverage the novelty of virtual reality in order to increase student engagement.
The fundamental task presented to the students is to translate a word from English to Swedish, or from Swedish to English.
In this task, the user is simply presented with a rotating 3D model of an object and asked to identify the English word for it from a multiple-choice UI.
By using novel objects, such as life-sized lions or airplanes, as well as shifting the user's location between tasks, this turns what is normally a relatively boring exercise into a novelty.
Here, users are asked to spell out the translation of a word. The twist is that the way they spell it is by "shooting" the letters that are raining down from the sky around them.
By having the letters rain down all around them, this turns the task into a full-body experience where the user has to constantly rotate back and forth to locate the correct letters.
This task is best done on a swivel chair.
TBbard is a Java application designed to parse MIDI files, convert them into a format suitable for playback within the MMORPG Final Fantasy XIV, and finally play them back automatically in-game.
Aside from properly processing the MIDI files, the game has several limitations that had to be worked around in order to play the files back correctly.
The application has three primary functions.
Making this work is more complex than it appears at first glance, and some compromises had to be made in order to fit the music to the in-game playback system.
The UI is designed using Swing, with a grid layout to make it easy to add future features.
Along the top of the UI sits a settings section, with various toggles and spinners for controlling playback. The main focus of the UI is the instrument selector, along with the textbox containing the actual script.
The primary challenge when designing the application was figuring out how MIDI files worked, as well as discovering the various limitations of the game's playback system.
The playback system in the game is much more limited than what can be encoded in a MIDI file, and several of its limitations had to be taken into account.
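One concrete example of such a limitation is the instrument's restricted note range. TBbard itself is Java, but the octave-clamping idea can be sketched in a few lines of Python (the exact playable range below is my assumption, not taken from the game):

```python
# Assumed bounds of the in-game playable range, as MIDI note numbers.
# Three octaves (C3 to C6) is a guess at the game's limit.
LOWEST_PLAYABLE = 48
HIGHEST_PLAYABLE = 84

def clamp_note(midi_note: int) -> int:
    """Shift a MIDI note by whole octaves until it fits the playable range.

    Shifting by octaves (12 semitones) preserves the pitch class, so the
    melody stays recognizable even when the original song spans more
    octaves than the in-game instrument allows.
    """
    while midi_note < LOWEST_PLAYABLE:
        midi_note += 12
    while midi_note > HIGHEST_PLAYABLE:
        midi_note -= 12
    return midi_note
```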
When the in-game playback system was first introduced, it had several additional limitations. Accessing higher and lower octaves required an additional button press (holding ctrl/shift), and playback was further limited by server latency.
The additional button press meant that it would take an additional frame for the octave shift to happen.
The server latency also meant that playback was further slowed down and some songs became completely unrecognizable.
Both problems were partially solved at the time, by adding a playback multiplier that would slow all wait times by a certain factor.
Another issue that was only discovered through user feedback was that the drag-and-drop feature in Windows doesn't work if User Account Control is enabled. This is because TBbard needs to be run as admin in order to send artificial input to the game, and Windows Explorer generally isn't run with admin privileges.
Over the years I've written a large amount of web scrapers. I originally started writing them in order to free up time for myself by having the scrapers compile daily newsletters for me.
The scrapers cover many different sites of varying complexity. Most are written in Java, or in Python using BeautifulSoup, and run at different intervals as cron jobs on a cloud-based server.
This was the very first scraper I wrote. I figured that I could save a lot of time by getting a daily news compilation every morning instead of wasting time browsing reddit by myself.
From a time saving perspective, this has probably been one of the best things I've done!
The scraper works by reading a SQLite database of the subreddits I've set it to watch, pulling all sufficiently upvoted threads via the reddit API, and finally compiling the data into a human-readable email that it sends to me.
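The compilation step might look something like this sketch (the thread dictionaries mirror the fields in reddit's JSON listings; `compile_digest` and the score threshold are hypothetical):

```python
def compile_digest(threads, min_score=100):
    """Build the body of a daily digest email from raw thread data.

    `threads` is assumed to be parsed output of reddit's JSON API: one
    dict per thread with at least 'subreddit', 'title', 'score' and
    'url'. Only sufficiently upvoted threads make the cut, grouped by
    subreddit and sorted by score within each group.
    """
    keep = sorted((t for t in threads if t["score"] >= min_score),
                  key=lambda t: (t["subreddit"], -t["score"]))
    lines = []
    current = None
    for t in keep:
        if t["subreddit"] != current:
            current = t["subreddit"]
            lines.append(f"\n## r/{current}")
        lines.append(f"[{t['score']}] {t['title']}\n    {t['url']}")
    return "\n".join(lines).strip()
```

The resulting text only needs a subject line and an SMTP call (e.g. via `smtplib`) to become the morning email.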
This has run flawlessly every morning for the past ~6 years, to the point where sitting down with a cup of coffee to read my "Daily Reddit Parse" has become part of my morning ritual.
Similarly to the Reddit scraper, this scrapes and compiles a daily newsletter that's sent to me every morning.
This scraper monitors a large amount of RSS feeds and emails the content on updates.
It uses SQLite to keep track of feeds and to differentiate between feeds that should be emailed immediately on an update and feeds that should be saved and emailed the next morning as part of a compiled newsletter.
For some feeds, I'm only interested in certain subjects, and so the scraper also has filters where it will ignore (or only read) entries that contain certain phrases.
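The phrase filters can be sketched as a small predicate (function and parameter names are hypothetical):

```python
def should_include(entry_title: str, only_phrases=(), ignore_phrases=()):
    """Apply per-feed phrase filters to an RSS entry title.

    Any match in `ignore_phrases` drops the entry; if `only_phrases` is
    non-empty, at least one of those phrases must appear for the entry
    to be kept. Matching is case-insensitive.
    """
    title = entry_title.lower()
    if any(p.lower() in title for p in ignore_phrases):
        return False
    if only_phrases:
        return any(p.lower() in title for p in only_phrases)
    return True
```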
I've written several scrapers that crawl different kinds of blogs to compile web serials into e-reader friendly epubs.
I've also written countless smaller and less interesting scrapers over the years.
This one sends me PDF snapshots (taken using wkhtmltopdf) of nearby convenience stores' webpages.
Sometimes the internet goes down, or there's upcoming maintenance. This is generally announced on the ISP's webpage, which is easy to miss, and annoying to navigate to over LTE on a phone when trying to figure out what happened.
So I wrote a small scraper to monitor their site and notify me when it changes.
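A change detector like this can be as simple as hashing the fetched page and comparing against the previously stored hash. A sketch:

```python
import hashlib

def page_changed(page_html, last_hash=None):
    """Return (changed, new_hash) for a freshly fetched copy of the page.

    Hashing the page text and comparing with the stored hash is the
    simplest change detector; a real scraper would first strip out
    dynamic parts (timestamps, counters) so they don't trigger
    false alarms, and then fire off a notification when changed is True.
    """
    new_hash = hashlib.sha256(page_html.encode("utf-8")).hexdigest()
    return new_hash != last_hash, new_hash
```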
I have a small army of scrapers monitoring various sites for sales and deals on games and software.
Some monitor changes on resellers' own websites, while others monitor forums that generally cover those kinds of sales.
Several scrapers monitor various sites for upcoming games and notify me on their release.
This bot has been in continuous development over the past several years as the game has developed. It provides active assistance in combat scenarios, automates several of the game's more monotonous tasks, and notifies the user of various events through audio notifications, as well as push notifications when AFK.
Over time, the functionality of the bot has changed and evolved alongside the game. As such, some of the original functionality is no longer relevant or in use.
This project was conceived as a response to a certain boss fight in the game, where the first 1-2 minutes of the fight played out identically every time but still required a frustratingly large amount of focus to clear correctly. At some point I realized that it played out so similarly each time that I could simply automate it; the bot's consistency took a lot of the load off my mind, while letting my teammates get comfortable with the fight quicker.
Long term, the bot evolved to also cover more general scenarios than just that one boss. It's been in active use for the past 3 years, and certain abilities and actions in the game are now completely controlled by it.
By intercepting my own keypresses and making use of audio feedback, I've managed to make the bot feel like a natural part of the game, and it's now very difficult to imagine playing without it.
VR Remote is a project investigating the feasibility of using virtual reality as a remote-control medium. Equipment is often controlled remotely because the environments it operates in are dangerous, not just to people but also to the equipment itself.
One of the key challenges remote operators face is understanding exactly what the terrain looks like around the remote machinery. In today's systems, this is generally done using traditional 2D cameras and monitors. With VR, operators could experience the terrain in 3D! The added depth perception would make the machinery easier to operate, minimizing errors and equipment damage.
We designed a system using an Oculus Rift and a keyboard as the control interface, with a Microsoft Kinect to scan the 3D environment and provide course correction via a modified Kalman filter. Finally, as a stand-in for the remote equipment, we used a Bluetooth-controlled Sphero ball.
The Sphero was chosen since it allows for remote-controlled movement, as well as the ability to change colors as a stand-in for various remote equipment actions.
The main challenges were getting the Sphero Bluetooth driver functional and designing the modified Kalman filter to properly compensate for drift.
An unexpected issue with the system was that users suffered rapid motion sickness when using the original design, in which the user was seated inside of a virtual Sphero.
After doing some research, we compensated for this by making the Sphero a flat disk on the floor, with the disk outline reflecting the boundary of the real Sphero. This let the user use the disk as a spatial frame of reference when controlling the Sphero, avoiding motion sickness.
As an alternative mode of control, we also implemented a realistic scale mode, where the user could walk around the environment as if they were there. This was less novel, but would undoubtedly be invaluable in real life use-cases.
The Sphero's 3D position was partially calculated from the ball's estimated movement based on control input, while long-term drift was compensated for by a Kinect mounted nearby.
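The drift-correction idea can be illustrated with a much simpler stand-in for the modified Kalman filter: blend the dead-reckoned position toward each Kinect measurement with a fixed gain (a real Kalman filter computes this gain from the estimated uncertainties; the fixed weight here is purely illustrative):

```python
def fuse_position(odometry_pos, kinect_pos, kinect_weight=0.1):
    """Blend a dead-reckoned position with a Kinect measurement.

    The short-term estimate comes from integrating the ball's commanded
    movement; each Kinect observation pulls the estimate back toward
    the measured position, cancelling long-term drift. Positions are
    tuples of coordinates; the fixed weight stands in for the Kalman
    gain for illustration.
    """
    return tuple(o + kinect_weight * (k - o)
                 for o, k in zip(odometry_pos, kinect_pos))
```

Applied repeatedly, the estimate converges toward the Kinect's view of the ball while still responding instantly to commanded movement in between measurements.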
One control mode seated the user virtually on the ball, where they could steer either by facing the direction they wanted to move or by using the keyboard, with a small blue indicator showing which way the Sphero was currently facing.
Alternatively, the Sphero could be controlled at realistic proportions, where the user could walk around the room and control the Sphero remotely as if they were actually there using it normally.