Open-source intelligence (OSINT) has attracted a lot of people who jump in without following best practices, and the result is bad analysis and misinformation. OSINT isn’t about racking up followers or throwing out flashy claims; it’s about careful research, collaboration, and showing your sources so others can verify your work. Below are a few major mistakes I’ve seen that can really undermine good OSINT work.
First, not citing the original source is a huge problem. OSINT is supposed to be open, meaning anyone should be able to trace a piece of information back to where it came from. If someone posts a video or an image without a direct link to its origin, verification becomes nearly impossible. The first instance of a file often contains valuable metadata that gets lost when it’s re-uploaded or altered, so tracking down the original is critical. That said, sometimes linking to certain content—like graphic violence or hate speech—raises ethical concerns. But generally speaking, transparency is better than hoarding information.
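To see why the first instance of a file matters, here is a minimal sketch of pulling whatever EXIF metadata survives in an image, using the Pillow library. The file name is a placeholder, and the point is illustrative: a copy re-encoded by a social platform will usually come back with an empty or stripped tag set, while the original may still carry camera, timestamp, or GPS fields.

```python
# A minimal sketch of why the original file matters: reading EXIF metadata
# with Pillow. The file path is a placeholder; re-encoded copies downloaded
# from social platforms will usually return an empty or stripped tag set.
from PIL import Image, ExifTags

def dump_exif(path: str) -> dict:
    """Return whatever EXIF tags survive in the file at `path`."""
    img = Image.open(path)
    exif = img.getexif()  # empty Exif mapping if the data was stripped
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    for name, value in dump_exif("original_photo.jpg").items():
        print(f"{name}: {value}")
```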
Another big mistake is letting personal bias take over. Everyone has biases, but when you’re doing OSINT, you have to separate your opinions from the facts. Just because a piece of information supports what you already believe doesn’t mean it’s true. Confirmation bias can lead people to cherry-pick evidence, ignoring contradictory data. The best OSINT work is honest about what’s known, what’s uncertain, and what’s completely unknown. Even if you have a strong opinion on a subject, you should still acknowledge when the data doesn’t fully support your view.
Not archiving material is another rookie mistake. The internet isn’t as permanent as people think—webpages get deleted, social media posts disappear, and sites go offline. If you don’t save a copy of what you find, you might lose it forever. Using services like the Wayback Machine or archive.today is the best way to preserve web content, but even taking screenshots is better than nothing.
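If you want to script the habit, here is a rough sketch of archiving a URL through the Wayback Machine’s public “Save Page Now” endpoint and then checking for a snapshot with its availability API. Both endpoints are rate-limited and their behavior can change, so treat this as an illustration rather than a robust archiving pipeline; the target URL is a placeholder.

```python
# A rough sketch of archiving a URL via the Wayback Machine's public
# "Save Page Now" endpoint, then checking for a snapshot with the
# availability API. Rate limits apply; this is illustrative only.
import requests

def save_to_wayback(url: str) -> int:
    """Ask the Wayback Machine to capture `url`; returns the HTTP status code."""
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=60)
    return resp.status_code

def latest_snapshot(url: str) -> str | None:
    """Return the URL of the most recent snapshot, if one exists."""
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url}, timeout=30)
    snapshot = resp.json().get("archived_snapshots", {}).get("closest", {})
    return snapshot.get("url")

if __name__ == "__main__":
    target = "https://example.com/some-report"  # placeholder URL
    print("save status:", save_to_wayback(target))
    print("latest snapshot:", latest_snapshot(target))
```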
Context is everything. A lot of OSINT mistakes come from misinterpreting normal events as something extraordinary. For example, if you don’t understand how NASA’s fire monitoring tools work, you might misread controlled burns as evidence of attacks. Similarly, people often jump to conclusions when they see flight tracking data, assuming a plane’s movement means something significant when it doesn’t. Knowing what’s typical versus what’s unusual requires domain expertise, not just a good eye for patterns.
Misusing tools is another major issue. Just because a tool exists doesn’t mean it’s always accurate. Facial recognition, for instance, can give false positives. Image forensics tools can be misread. Even satellite imagery can be misleading if you don’t know what you’re looking at. OSINT isn’t about throwing data into a tool and accepting the first answer you get—it’s about understanding the limitations of your methods and cross-checking results.
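One small illustration of “understand your tool’s limits”: a perceptual-hash comparison with the imagehash library, a common quick check for whether two images show the same frame. The distance threshold below is an arbitrary assumption, and that is exactly the problem; set it too loose and unrelated images “match,” too strict and legitimate re-encodes of the same scene are rejected. The file names are placeholders.

```python
# A perceptual-hash comparison with the imagehash library. The threshold is
# arbitrary: too loose and unrelated images "match", too strict and honest
# re-encodes of the same scene are rejected. File names are placeholders.
from PIL import Image
import imagehash

def hash_distance(path_a: str, path_b: str) -> int:
    """Hamming distance between the average hashes of two images."""
    return imagehash.average_hash(Image.open(path_a)) - imagehash.average_hash(Image.open(path_b))

dist = hash_distance("claimed_scene.jpg", "reference_scene.jpg")
print(f"hash distance: {dist}")
# A low distance suggests the images are related, but it is not proof:
# crops, mirrors, and heavy edits can fool the hash in both directions,
# so the result still needs manual, contextual verification.
print("possible match" if dist <= 5 else "no strong match")
```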
Another frustrating habit is editing footage in ways that make verification harder. Some OSINT accounts slap watermarks all over images, add dramatic audio to videos, or cut footage to make it seem more exciting. The problem is that these changes can remove crucial details that might help confirm or debunk a claim. Keeping content in its original form is always the best practice.
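One hedged way to show you are sharing material as received is to publish a cryptographic hash of the file alongside it, so anyone holding the same bytes can confirm nothing was re-encoded, watermarked, or trimmed. The sketch below uses Python’s standard hashlib, and the file name is a placeholder.

```python
# A small sketch of documenting that a file is shared as received: record its
# SHA-256 digest so others can recompute it and confirm the bytes match.
# The file name is a placeholder.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

print(sha256_of("source_footage.mp4"))
```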
Finally, there’s the rush to be first. When big news breaks, there’s an incentive to post something fast, especially on social media. But rushing leads to mistakes, and in worst-case scenarios, those mistakes can have real consequences. Misidentifying innocent people as criminals, pushing out unverified claims, and failing to fact-check before posting all contribute to misinformation. Being right is always more important than being first.
OSINT is a powerful tool, but only if it’s done right. The best researchers are the ones who prioritize accuracy, transparency, and ethics over clout and quick engagement. If you’re serious about OSINT, avoiding these common mistakes is a good place to start.