Trace Labs is a nonprofit organisation that helps locate missing persons through crowdsourced OSINT. Anyone with a passion for the cause is welcomed into the Trace Labs community, and Trace Labs gives that community a number of ways to contribute. Every month, a number of officially missing persons are posted on a Trello board, where anyone can add information found online that might help law enforcement in their attempts to locate the individual. However, Trace Labs is predominantly known for its Capture the Flag (CTF) events. Trace Labs has sensitively gamified locating missing persons by hosting events where, for six hours, teams of up to four players can submit any evidence they can find online about a missing person that is currently unknown to law enforcement.

Last week, the fifth Global OSINT Search Party CTF organised by Trace Labs took place. As a professional Internet Investigator and OSINT specialist, I've known about Trace Labs for a while; however, I'd never managed to sort out my schedule to take part in a CTF before. The Trace Labs Global OSINT Search Party CTF V marks the first of what will now be monthly competitive events, and as such it was scheduled at a time that would suit competitors worldwide, running between 4 pm and 10 pm UK time. With this in mind, I knew that now was the time to take part, so I quickly signed up for the event. It's a good job that I did, because the event sold out: 650 participants from around the world made up 191 teams. During the six-hour submission period, those 650 participants submitted a total of 8,353 pieces of intelligence, 6,286 of which were approved and will be submitted to law enforcement once consolidated. I had a really fantastic time taking part in the event, even though I was essentially doing my day job.
The added atmosphere of camaraderie and the competitive element really add to the experience, as does the feeling that you're helping to make a difference. The devotion of the Trace Labs team of Robert Sell, Adrian Korn and James Liolios, as well as all the people who volunteered their time as judges for the event, is clear, and I'm really happy to see someone finding a way to put the skills of so many to good use. My team ended up doing a little better in the competition than we had expected, placing third overall, which we were all really happy with. Given that it was a new experience for all of us, I thought I would put together some of my thoughts on how the team went about the process and share some tips that I picked up along the way.
Forming a team
If you want to take part, don't worry about not being able to find a team. I signed up for the event pretty much as soon as it launched, without checking whether any of my friends and colleagues were free. The OSINT world is a relatively small and really friendly one, full of some really fantastic people, and I very quickly found myself a spot on a team of professional OSINT specialists who were all new to the competition. It turned out that Paul Brelsford was looking to put a team together for his first CTF, and his teammate Alan Hill asked me if I was interested in joining, along with Robert Alexă. If you don't know anyone else competing, or you're new to OSINT, don't despair: you can go out and find a spot on a team quite easily. Trace Labs has a Slack channel where it appeared quite a few teams were formed, and I've also seen teams forming on the OSINT subreddit and on LinkedIn. Don't be afraid to put yourself out there and see who else is looking to form a team; there's bound to be someone in the same boat as you. Working on a team gives you the chance to learn from your teammates and pick up techniques you might not already know; it's a great way for the OSINT community to learn from one another.
Meeting up and making a plan
A week before the event took place, my team had a Zoom meet-up, and this really helped us form our dynamics beforehand. Although I knew of my teammates from the OSINT community, I'd never actually worked with any of them before the CTF. There were some pre-existing dynamics: Alan and Paul know each other really well, having both worked at the MOD; Alan and Robert currently work together; and I've spoken to Alan before about all things OSINT. However, that was the extent of the previous relationships within our team, so our virtual get-together was a really fantastic opportunity to get to know each other. It allowed us to share our backgrounds and experience within the OSINT field and to discuss what we thought might make for a good strategy, with our limited knowledge of how the CTFs work. We very quickly decided that our first attempt at a CTF as a team was just going to be a bit of fun. We knew that we weren't going to win; the Global OSINT Search Party CTF brings out some of the best OSINT specialists around, and a lot of people taking part have previous experience of how the event works. However, we wanted to familiarise ourselves with the process and look at what techniques might support us in future attempts at the competition. We set to work on a strategy that we thought would work, aware that we might have to adapt it mid-competition. We'd seen the Trello boards used by the Trace Labs community and decided that we would mirror these and populate our own versions during the CTF. We also decided that we would rotate intelligence submission and liaison with the judge every 90 minutes so that everyone had a turn.
Sticking to the plan?
We had no idea what a winning strategy looked like, and we felt that as soon as the competition started. It turned out flexibility was going to be necessary for our first CTF; we pretty much threw our strategy out of the window after identifying that it wasn't ideal for the time-critical nature of the competition. We all started out by populating our Trello boards, as we had agreed. However, we realised there was going to be a bottleneck where one of us would have to submit everything, and it would get really confusing when we switched who was leading submissions. We were also duplicating our intelligence submissions by keeping them in two places. So, about an hour into the competition, we decided to scrap the Trello boards. You can keep track of what you've submitted on the submission page set up by Trace Labs; however, I found it easy enough to keep track of what I'd submitted mentally. The planned rotation of intelligence submission and communication with the judge also didn't play out. We decided that it was far easier for each of us to submit the intelligence we had found ourselves, as we would be in a better position to justify our case if a submission got rejected. There were also a lot more teams than judges, which understandably makes it difficult for judges to grade each team in real time. We ended up not being assigned a judge until about three hours into the competition. Given this was our first time, we had no idea if our submissions were being made in the correct way. When a judge first looked at our submissions, two were rejected and one was accepted, so after three hours we had 50 points. Thankfully, the scores started to come in quickly for us as more of our submissions were accepted. My advice to anyone in this situation is not to panic. We didn't waste time chasing for a judge; we trusted that everything would be looked at in time, as Adrian Korn had assured us before the event started.
We also trusted our own abilities: whilst nothing was being approved, it also wasn't being rejected, so we just carried on and hoped for the best.
Dividing the workload
CTF V consisted of eight missing persons to conduct activity against. As there were four team members, we decided to take two each, picking names at random without looking at the additional details. We later identified that two of the eight subjects had been missing for a long time and we were unlikely to find much about them on social media. I had one of these subjects, who had been missing since about 2003; the only photos of her available were of her as a child and a photofit of what she might look like as an adult. This was going to make finding anything on her quite difficult, with only the photofit, her name, and two possible locations to go on. I had a quick look and found about 10 Facebook accounts of women with her name in those locations. Identifying whether any of these were her was going to take time and would be speculative. So instead, I decided to spend most of my time on my other subject, with whom I was faring far better and who looked likely to be a lot more fruitful due to his greater online presence. Towards the end of the competition, once I had found everything I could on my subject, I started looking at the subjects that we hadn't prioritised, trying to scoop up some quick points for the team. In total, I submitted about 23 pieces of intelligence on my primary subject and maybe five on other subjects, which amounted to roughly a few thousand points for the team. Unfortunately, I forgot to check my exact figures before the end of the competition, so I've had to work it out from memory. Something I would love to see in future is a personalised counter of how many of your submissions were accepted and how many points you scored individually, although I can understand why it might not exist.
If this could be seen by the individual only, to avoid any risk of throwing off team dynamics, it would be a great way to track your own development as the competitions take place.
Quality before Quantity
Out of all the teams in the top fifteen, we had the lowest number of submissions. Within the six-hour period, my team had 75 pieces of intelligence approved, whilst the average for the top fifteen was 100. My teammates and I all took different paths and approaches to collect our intelligence; however, we were all looking for quality information rather than simply as many pieces of intelligence as we could find. I initially identified a few close friends and family members of my subject; however, their combined score wasn't massive, and after I had found the closest friends I felt I could find more important information for law enforcement by digging deeper rather than further establishing their network. So, I pretty much gave up on their social network and decided instead to go for the greater point scorers that law enforcement may not be able to find as easily. An example of this was scoring 450 points through three submissions relating to the subject's phone history. My teammates went for a similar strategy. We ended up getting quite a lot of points from just looking at the photos of the subjects and using the context to identify lifestyle indicators to assist our investigations. Paul was able to earn us a few coveted 500-point "Day Last Seen" scorers on his subject, which came from his pursuit of a possible husband in the photos of the subject, whom he matched to someone in her network whilst pursuing his theory of an elopement.
The points system
We were relatively unfamiliar with the scoring process. We each had a copy of the scoring table in the very handy guide that has been produced for those taking part; however, I didn't familiarise myself with it as much as I could have. I had a rough idea, but I hadn't committed each field to memory, which meant that I had a lot of checking to do during the competition. We also identified relatively late into the competition that we were being too hard on ourselves. We all do this for a living, and we were looking at the intelligence from the point of view of what would go into an active subject profile. Quite late into the competition, I submitted a CCTV image from the day the subject was last seen that I had seen online about five times. I had assumed that it had been released by law enforcement and that, as such, they wouldn't need to see it again. However, the competition exists to compile the evidence in one place. This was an easy 500 points that I'd been ignoring whilst looking for more elusive intelligence.
Qualify and capture your submission
When you submit your intelligence, you are asked to provide both an explanation of it and the relevance of the information. If you fail to do this in enough detail, the submission may get rejected. If you think the submission adds value, then you have to explain why, to help the judge see it from your point of view. Make sure to explain how you found the information and why it might help investigators find the missing person. You're also given the opportunity to add a screenshot of the submission, which I pretty much always did. To do this quickly, I relied on the Full Page Capture option in the Nimbus Screenshot & Screen Video Recorder browser extension. Even if you're just submitting a URL, a webpage can change. The competition is there to help find someone, so if the page has changed by the time it finds its way to law enforcement, the relevance may be lost. In my eyes, it's crucial that each page is downloaded on the day of capture.
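If you prefer to script that capture step, here is a minimal sketch in Python using only the standard library. The function names and filename format are my own for illustration, not part of any Trace Labs tooling, and this grabs only the raw HTML (no images or scripts), so it complements rather than replaces a full-page screenshot:

```python
import re
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

def timestamped_filename(url, when=None):
    """Build a safe filename like 'example.com_page_20200712T160000Z.html'."""
    when = when or datetime.now(timezone.utc)
    stamp = when.strftime("%Y%m%dT%H%M%SZ")  # UTC timestamp proves capture date
    # Strip the scheme, then replace anything unsafe for a filename with '_'
    slug = re.sub(r"[^A-Za-z0-9.-]+", "_", url.split("://", 1)[-1]).strip("_")
    return f"{slug[:80]}_{stamp}.html"

def save_page(url, out_dir="captures"):
    """Download the page body and write it to a timestamped local file."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = resp.read()
    path = Path(out_dir) / timestamped_filename(url)
    path.write_bytes(body)
    return path
```

The timestamp is deliberately baked into the filename so that, months later, law enforcement can see exactly when the page looked the way it does in your capture.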
Facebook is a goldmine. I conduct Internet Investigations pretty much every working day, and it's always my first port of call when it comes to social media sites. People are happy to post everything about themselves on Facebook, which makes the life of an Internet Investigator much easier. The majority of our points came through Facebook, with other social media sites like Twitter and Instagram picking up the slack and helping us fill in the blanks so we could move forward on Facebook. However, don't forget other options like Snapchat, TikTok and Strava. You need to be really familiar with how to navigate these social media sites from the point of view of an investigator, rather than an everyday user.
Take regular breaks
Every hour of the competition, we got together for a 5-10 minute Zoom huddle. This meant stepping away from the mouse and keyboard and discussing our progress. It might seem counterintuitive, as it meant time away from looking for points; however, it allowed us to work out where we were best placed to look for further information. In these chats, we discussed our theories about the subjects, which helped direct our next steps as we looked for further relevant information. We also each took about forty-five minutes to step away from our screens and go and make dinner. The competition is only six hours, and this won't suit everyone as a strategy. However, the time away from the competition left me feeling refreshed and re-energised to continue.
Setting up your workstation
As standard, I use a VPN at all times when I'm surfing the web, so I had this on. I didn't bother with a virtual machine for the CTF. I wasn't expecting to come across any dangerous files or any major risk from the activity I was conducting, so I didn't deem one necessary. The only major benefit I can see for a virtual machine in the CTF environment is the pre-installed tools. Trace Labs has launched a brand new virtual machine with pre-installed bookmarks and Python tools that could come in really handy, particularly for those who are newer to OSINT. I did, however, set up my browser with the sites that I expected to use. I logged into all my sock-puppet accounts and had social media sites ready, as well as loading up some username-searching sites like Namecheckr and WhatsMyName. Having these sites ready to go meant I could start immediately once I had a subject's name. I had another browser window open that contained my Search Party submission tab for evidence, along with tabs for Slack and the scoring system. My team kept in touch throughout the competition on Slack, so having this in a window separate from my investigative work was crucial. I very quickly realised that a second screen would have been invaluable, so it was a shame that I didn't have one. On my desk at work, I have two 28-inch monitors plus my laptop screen when I'm conducting internet investigations, so when I usually work from home I feel like I'm slumming it with one 28-inch monitor and a laptop screen. During lockdown, I've been located away from all of my monitors, which meant all I had to rely on during the CTF was my 13-inch laptop screen. Split-screening was difficult, particularly on such a small screen, so I would definitely recommend a second screen.
Familiarise yourself with current OSINT tools and techniques
My team was made up of professional OSINT specialists; however, a lot of competitors are OSINT hobbyists or even brand new to OSINT. Trace Labs has two official training partners that support training for competitors: Joe Gray and OSINT Combine. I didn't manage to check out the training from either provider; however, I heard good things about both. If you want to familiarise yourself with the tools and techniques that my team relied on, then I recommend checking out my OSINT course on Teachable. My team relied almost exclusively on the tools and techniques demonstrated within this course. My specialist skill set within OSINT is finding people online, as that's what I do for a living, and so my course provides a lot of different approaches to assist with this, including a detailed section on the best current techniques for searching on social media sites. If you intend to take part in a future CTF, the content of my course would put you in a great place for attempting a podium finish.
Publicise your efforts
Trace Labs exists for one reason: to help expedite the family reunification of missing persons. This can only be achieved through crowdsourcing efforts like the CTFs, and for these events to continue there must be people to take part in them. Publicising the events is therefore a major win for Trace Labs, as it means more potential participants, and Trace Labs has just made doing so much easier. All participants, whether competitors or judges, get a participation badge through Badgr that can be easily shared online. I've long considered using Badgr for my own courses, so it's great to see it work in practice, and it's prompted me to finally start issuing these. Teams that finish within the top three on the leaderboard receive a black, silver or bronze badge as well, which is really cool. My team and I discussed before the competition started that there don't seem to be any major players currently recruiting directly out of these events. OSINT skills are ever-increasing in demand in the workplace, and these events are the perfect place to pick up some real talent. So, even outside of raising awareness of Trace Labs, I would recommend that you publicise your efforts in these events; who knows when potential employers might start paying a little more attention to the possibilities they offer.
The next Trace Labs Global OSINT Search Party CTF takes place in August, with the exact date to be announced very soon. If you have any interest in OSINT at all, make sure to sign up; it's a great opportunity to do some work that helps the global community, learn some new tricks and just have an absolute blast!