Over the last weekend, I was able to attend Shmoocon for the third year in a row. This year, however, I decided not to work on CTF challenges the whole time while "listening" to talks. Instead, I was actually present for the talks, and it turns out Shmoocon has some great discussions. While most people at Shmoocon try to take in a lot of technical advice and come back in full force, ready to tackle new exploits or something of that nature, this year I left more inquisitive than fired up and ready to "pwn." In this post, I hope to explain four topics that have made me think a lot since the weekend, things that maybe we don't think about enough in the security community or the dev community.
The user isn’t stupid
Every year at Shmoo, one of the founders of Shmoocon, Bruce Potter, gives a rant about the industry and the changes he wishes to see in it. And every year, I think I have heard Bruce make the same point: users aren't stupid, and we need to start building better tools to protect them. This got me thinking a lot about what we do in the security industry. I see this done in many areas of business: we call those who use bad passwords dumb, or we think someone is stupid because they can't figure out the programs we wrote. Often the user is not actually at fault. It is our fault for not educating the masses.
This applies to almost any area of business. I have seen people call clients stupid for not being able to do the work the company does, although if clients did know how to do it, we wouldn't have jobs. I remember a recent conversation with a friend of mine who works in user experience. He said: "Developers and IT people are not great at designing tools. They usually work great, but internal tools are always a pain to work in." And if we are honest, it is true. While some tools that we design function great for our needs, 80% of the staff will be confused by the buttons, the naming conventions, or something else. We think differently than most. That is why your CEO doesn't have 2FA and uses the same password for Facebook as for their corporate email. They haven't been trained to think like a hacker; they have been trained to make deals and sell an organization.
This talk made me think more about why I want to be in this industry. I think most people join the industry for one of two reasons: they want to break stuff and not get arrested, or they want to break things and protect users. I hold firm that my purpose is to protect others. Breaking stuff is fun, and I find figuring out solutions to challenges exciting, but at the end of the day, I work in this industry to help others learn how to protect themselves from people like me, who want to harm them.
Bruce ended by encouraging us to get better at what we are currently doing, instead of trying to learn everything, which leads to the next takeaway.
I need to stop working more and start working smarter
Bruce also noted that the security industry has a hard time selling itself because we tell everyone they must consistently work off the clock to keep learning. This leads to a massive amount of burnout and fatigue, and it makes us irritable with future generations. The security community prides itself on people who work 60 hours a week, with at least 20 of those hours off the clock, on our own time. As a leader, it made me think more about learning opportunities for the employees around me. What can I do as a leader to provide more learning opportunities for the staff at my organization without forcing them to learn on their own dime?
I must admit this is a challenging endeavor: first, because I must practice what I preach, and I am the master of always working on something or toward some goal; second, because I work for a small organization, and it feels like there is always too much work to do. Four years ago, when I started in IT, I learned a lot by doing. I was pushed to learn new things because if I didn't learn them, they would not get done. Luckily, my bosses gave me the time I needed. Maybe that is where it starts: giving others small projects they can learn on and that push them to do better. Training sessions and mentoring from leadership will also help mold employees. The security community is small, and sometimes it seems difficult to learn. Leaders need to start mentoring those under us (and I am still very new to the field; I need mentoring myself) and give them the chance to learn on the clock, not off it.
What was a great feature could turn into a bug
During the opening day, I got to see a talk by Jonathan Leitschuh, who is best known for finding the Zoom zero-day in 2019. What I found most interesting about Jonathan's talk was his interaction with Zoom's developers. They insisted that the vulnerabilities Jonathan found were features that made Zoom attractive to their clients. While the local web server was a feature that made the Zoom product easier and more efficient to use, it also created a backdoor that put customers at risk.
This could be true of just about any application. Most developers did not learn security and, honestly, don't think like hackers most of the time. Their job, in most cases, is to ship a product as fast as possible that meets the needs of the organization. Many of these developers are being pushed to move even faster. I have been pushed to get products out the door in my organization very quickly, and it has prevented me from doing the job I would have liked. I'm sure most penetration testers would tell you they wish they had longer to poke around for holes in a network as well. The point is, in the security community, as in all industries, we need to understand that the "features" few people are asking for can open holes we weren't expecting. As a security person, I need to accept that not everyone will think the way I think, and I need to treat that as a learning experience rather than a whipping stick. As a developer, I need to understand the scope of the program I am writing and think about the holes that a convenient feature could open.
Security isn't just CVEs; it's also misconfigurations
Lastly, I enjoyed a talk by Mark Manning on Kubernetes security for pentesters. It was an excellent talk, primarily because Mark noted that it was not about zero-days in Kubernetes but about misconfigurations. This is something I struggle with myself; we often focus on zero-day attacks or CVEs, but we don't focus enough on proper configuration. Every conference seems to be focused on zero-days or new vulnerabilities. I rarely see a talk where someone shows how they pwned a system because it was not correctly configured, yet that seems to be the nature of a large number of attacks. The Capital One data breach did not happen because of a vulnerability that was disclosed and left unpatched, or some new zero-day that was dropped. It happened because a misconfigured firewall left Capital One's cloud-hosted data exposed to the internet. That has shifted my train of thought. There are likely three times as many systems on the internet vulnerable to attack because of misconfigurations as there are systems vulnerable because of a zero-day.