New Year’s Eve: Musings on Y2K
At 3pm PST on 31 December, 1999, I sat down at the computer in my home office in Yakima, Washington. I logged remotely into the network at HQ and started monitoring our systems. The most critical moment would come at 4pm local time. We were in Pacific Standard Time (PST), UTC-08:00. In other words, at 4pm in Yakima, it would be midnight in Greenwich, England, where the time zone aligns with Coordinated Universal Time. (Coordinated Universal Time is abbreviated as UTC, not CUT, because there are actually other languages in the world besides English, and… never mind. Look it up if that story interests you.)
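That offset arithmetic is easy to double-check in a few lines of Python — a minimal sketch, assuming `America/Los_Angeles` as the tz database zone covering Yakima:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# 4pm on 1999-12-31 in Pacific Standard Time (UTC-08:00 in winter)
local = datetime(1999, 12, 31, 16, 0, tzinfo=ZoneInfo("America/Los_Angeles"))

# Convert to UTC: 16:00 + 8 hours = midnight on 2000-01-01
utc = local.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())
```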
Anyway.
The GPS satellites run on UTC, and our entire multi-state operation depended on GPS timing. My first hint of a system failure caused by a Y2K bug would come at midnight UTC.
Starting at 3:55pm, I tested the major system once a minute. At 4:05pm I sent out the notice to corporate management that all was well.
After that I tested hourly, though the next critical moment wouldn't come until 9pm PST, when midnight hit the US East Coast. Our equipment was all in MST and PST, but some of our many telecom providers might have systems coordinated to local time in some other US time zone. (They'd all be using GPS now, but this was 1999, and US telecommunications had plenty of legacy systems with other clocking methods.)
In the end, nothing failed. Our entire system worked.
This wasn’t because Y2K was overblown.
It was because we replaced our billing system, which wasn’t able to generate an invoice after the date flip.
It was because we did software updates on several proprietary systems that would have failed.
It was because we did firmware updates, too.
Equipment inventories.
Application inventories.
Operating system inventories.
Software version inventories.
Firmware version inventories.
The reason January 1, 2000 seemed like such an ordinary day is because of the MASSIVE amount of work and money spent to make it ordinary. There are unsung heroes around the world who put in the work to update or replace systems that would’ve failed otherwise.
If you’re one of those people, I would love to hear your story.
Overheard an infosec dude badmouthing an employee who recently failed a phishing test.
Dude! Not clicking on links and identifying attacks may be YOUR job, and I hope you’re great at it (your org depends on it). But most employees have a DIFFERENT job. Help them do that job safely, then get out of their way.
It was super fun to interview @jerry for this week's episode of the Infosec Sidekick Podcast!
I had wanted to do this a while back, when the heat of the Twitter migration was underway, but I almost feel like now was a better time.
With the dust somewhat settled, @jerry and I talk about information sharing, community building, and how Mastodon plays a role in that.
I genuinely appreciate this conversation and hope it can provide you some value and entertainment throughout your week.
You'll be sure to find gems in this episode, such as the unlikely comparison of Twitter vs. Mastodon to the Monsters, Inc. power generation model (don't ask, just listen lol)
If your first instinct is to try and find blame when a security vulnerability is pointed out...
...you have already created an environment where everyone will hide issues from you.
You currently live in a fake reality where you think everything is fine and you have no idea the rot that is underneath you.
If you fire or punish a person every time a vulnerability is found, you will have no one left. Hell, fire yourself first to save us all the trouble.
Vulnerabilities exist. The world changes. Software changes. Attacks change. Business needs change.
Life is fucking impermanence.
So create an environment where folks come to you quickly and tell you what needs to be fixed as they find it.
How do you do that?! Reward vulnerability discovery. Reward mitigations. Reward patch management. Reward security improvement. Reward safety improvement.
Hi, Mastodon, I’m a Sr. Security Engineer with more than 15 years of experience building reliable telecommunications infrastructure at global scale.
I’m looking for work in one of these domains:
Cyber Threat Intelligence (CTI)
Detection Engineering
Jr. Software Engineering
Pre-sales engineer (B2B SaaS)
News about significant data breaches seems to break on a daily basis now. Yet some (business) people still give me strange looks when I tell them that the best way to protect data is to not store it in the first place. 🙄 You can't lose what you don't have. It's that simple. 🤷♂️
— #privacy #DataProtection #GDPR #InfoSec #InformationSecurity #DataBreach #DataBreaches
Super weird to me that Dropbox has told Dropbox Sign customers to "delete your existing entry and then reset it" if they use app-based MFA. I have never seen "delete your MFA and create new tokens" in post-compromise account hygiene advice before.
I suspect two things:
1.) Dropbox was storing plain text MFA seeds right next to their password hashes
2.) We're going to hear a lot more about this soon.
For two decades, I've heard security professionals urging organizations to "just patch your stuff" as though they don't already know that and/or it's as simple as saying those words. This is where real data and "thought leaders" differ. The data acknowledges that things aren’t so simple in the real world because vulnerability remediation is a moving target (new vulns are found as old ones are fixed).
We measured the remediation capacity of hundreds of organizations over a 12-month period. To do this, we calculated the total number of open (unremediated) vulnerabilities in the environment and the total number closed each month. We then averaged that over the active timeframe to get a monthly open-to-closed ratio for each organization and created a log-log regression model. The results are recorded in the figure below, where each organization is plotted along the grid. And those results are INSANE!
The R² statistic for this log-log regression model is 0.93, meaning that it’s very strong and captures most of the variability around vulnerability closure rates. You can see this visually in the figure because all the points—which represent the remediation capacity for each firm—fit tightly along the regression line.
Strong models are great, but there’s something else we learned that’s greater still. Notice first that each axis is presented on a log scale, increasing by multiples of 10. Now, follow the regression line from the bottom left to upper right. See how every tenfold increase in open vulnerabilities is met with a roughly tenfold increase in closed vulnerabilities?
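The shape of that analysis can be sketched in a few lines of numpy. This uses synthetic data, not the real per-firm figures, and bakes in the "close roughly 1 in 10 per month" rate as an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_firms = 300

# Synthetic stand-in data: open vuln counts spanning several orders of
# magnitude, with each firm closing roughly 1 in 10 per month (plus noise).
open_vulns = 10 ** rng.uniform(2, 6, size=n_firms)
closed_vulns = 0.1 * open_vulns * 10 ** rng.normal(0, 0.15, size=n_firms)

# Log-log regression: fit log10(closed) against log10(open)
x, y = np.log10(open_vulns), np.log10(closed_vulns)
slope, intercept = np.polyfit(x, y, 1)

# R^2: fraction of variance in closure rates explained by the fit
y_hat = slope * x + intercept
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"slope ~ {slope:.2f}, intercept ~ {intercept:.2f}, R^2 ~ {r2:.2f}")
```

A slope near 1 on log-log axes is exactly the "tenfold in, tenfold out" pattern described above, and an intercept near -1 corresponds to the one-in-ten monthly capacity.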
That, in a nutshell, is why it feels like your vulnerability management program always struggles to keep up. And why "just patch it, stupid" is ignorant and unhelpful advice. A typical organization will have the capacity to remediate about one out of every 10 vulnerabilities in their environment within a given month. That seems to hold true for firms large, small, and anywhere in between.
So is there no hope? Are vulnerability management programs destined to slowly drown in a quagmire of their own making? No! We did observe organizations that managed to drive down risky vulns in their environment over time...but that's another story for another post. Follow / stay tuned for their secret (hint: it doesn't require buying a product).
My book 'PROPAGANDA: from disinformation and influence to operations and information warfare' treats the subject comprehensively and expertly. Information surrounds us. How does information influence work? An expert treatment of the subject. https://blog.lukaszolejnik.com/propaganda-my-book-on-information-security/
🚨 ALERT: A massive ad fraud botnet called PEACHPIT has been exposed. It exploited hundreds of thousands of #Android and iOS devices to generate illicit profits for cybercriminals.
🚨 New Threat Alert! BunnyLoader, the latest #malware-as-a-service, is up for sale in the dark web. It can steal your data, replace your #cryptocurrency addresses, and more.
Security folks, I need some help. My wife is looking for a job after taking a few years off to take care of the kids and she's having a hard time finding legit security opportunities. And the legit ones she does find don't like the gap in her resume.
If you have or know of any legit remote openings for someone with experience in identity and access management, can you please share?
She has her CISSP, and while most of her experience is in IAM she's willing to branch out and learn a new specialty. She also happens to be both the faster learner and the smarter of the two of us!
Features the Hunton Andrews Kurth LLP blog post, "Utah Publishes Proposed Rules for Age Verification and Parental Consent in Social Media Law."
NOTE: I don't always have a chance to post my newsletter to social media. BUT...if you subscribe to receive via email, you won't miss it! (It's free, see registration link on my blog post page)