Connecting the Dots

On July 31, 2018 I attended the first National Cybersecurity Summit at the US Customs House in lower Manhattan. The building itself was constructed around 1902-1907 to collect tariffs; Teddy Roosevelt was President, and tariffs were a subject of divisive national debate. Global issues were still in evidence at the Cybersecurity Summit, with the administration promoting new initiatives to protect US critical infrastructure and democratic processes. In attendance to support these initiatives were Vice President Pence; Energy Secretary Rick Perry; FBI Director Wray; General Paul Nakasone (NSA and US Cyber Command); Kirstjen Nielsen, Secretary of DHS; Chris Krebs, head of DHS's NPPD (National Protection and Programs Directorate); as well as CEOs from industry and leaders from academia. Audience members filled the 350-seat auditorium and spilled over into another viewing room down the hall.
So, what was new, if anything? Secretary Nielsen announced the new National Risk Management Center (NRMC), designated to be a focal point within government for private-public collaboration on cyber related risk issues. You can find the fact sheet on NRMC here. Interesting that the word “cybersecurity” is not in the name of this group. Two thoughts: maybe she is thinking the term will go out of favor. Also, many of the real risks to society and the economy are second and third order effects, not just the initial cyber-attack consequences. To start, the focus in NRMC will be on the financial sector, energy sector and ICT (Information and Communications Technology) sectors. A 90-day sprint will be initiated. The NRMC Director is yet to be named.
A second new direction was articulated by Vice President Pence, when he argued that the previous administration had been weak on cyber preparation and response; now the Trump administration is reversing that strategy with stronger action in both areas. Given that everything in DC must have a political component, this sounded like one positive step for better cyber security both within government and in the private sector.
The NRMC sounds promising; I am hoping it does not focus only on incident detection and response. Risk management includes the whole lifecycle: identify, protect, detect, respond, and recover. I would like DHS to share more proactive information regarding cyber-attacks. The 2015 Cybersecurity Information Sharing Act did call for the Federal government to share best defensive practices based on ongoing analysis of threat indicators. I call this "evidence-based security". It is needed to develop cost-effective defenses ahead of the next attack. Unfortunately, the supporting legislation in Congress, HR 5074, seems to focus on attack detection and remediation. Another new private group, the Financial Systems Analysis & Resilience Center, is focusing on analysis of strategic cyber risks within and between member banks.
One more note from the Summit: the President’s NSTAC (National Security Telecommunications Advisory Committee), which has been working on a Cyber Moonshot study, will report out in the next couple of weeks. This could be the overall risk management and mitigation program that we need.


Two recent privacy laws, GDPR and the California Consumer Privacy Act (AB-375), focus more attention on protecting the digital privacy of individuals. Both laws will require that security professionals up their game. In this post I will cover some of the security implications of AB-375. Gone are the days when privacy requirements could be handed off to privacy officers or legal counsel. Today's requirements are so granular that they will require new security technology, processes and knowledge.
To summarize the California Consumer Privacy Act of 2018:
- It goes into effect January 1, 2020
- It includes a private right of action in breaches involving unencrypted or nonredacted personal information
- It offers California citizens the right to:
  - Know what information is being collected about them
  - Know if their information is being sold and to whom
  - Forbid sale of personal information
  - Gain access to their personal information
  - Retain their rights to equitable service even if they forbid sale of their information
- Exceptions are made for businesses that are not located in California and conduct their business outside of the state. This exception would apply to Las Vegas casinos, even when serving California citizens.
What are some of the implications of these rights for security professionals? Broadly, they fall under the requirements for confidentiality, integrity and risk management. One area is data classification and handling. Often neglected in risk management, classification is now front and center. Businesses must know what information they are collecting and where they are getting it from. Businesses will have to respond to consumer requests regarding the categories of information they keep about consumers. Classification must include: categories of information; specific pieces of information collected; sources of information; commercial purpose for collecting; third parties to whom the data is sold; and whether the information may be sold at all.
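To make this concrete, here is a minimal sketch of what one entry in such a data inventory might look like. The field names are my own illustration, not terms from the statute:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PersonalDataRecord:
    """One entry in a hypothetical data inventory supporting AB-375 requests.

    Field names are illustrative assumptions, not statutory terms.
    """
    category: str                  # e.g. "browsing history", "geolocation data"
    specific_elements: List[str]   # the actual data elements collected
    sources: List[str]             # where the data comes from
    commercial_purpose: str        # why it is collected
    sold_to: List[str] = field(default_factory=list)  # third parties receiving the data
    may_be_sold: bool = False      # whether sale is permitted at all

# Hypothetical example entry
record = PersonalDataRecord(
    category="geolocation data",
    specific_elements=["lat/long from mobile app"],
    sources=["iOS app SDK"],
    commercial_purpose="store locator and local offers",
    sold_to=["ad network (hypothetical)"],
    may_be_sold=True,
)
```

Whatever form the inventory takes, the point is that every classification element the law names has a home, so consumer requests can be answered from one place.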
The definition of “personal information” is now broader than many consider at first glance. AB-375 defines it as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” Identifiers include name, address, IP address, email address, browsing history, search history, geolocation data, employment information, audio information, and more. More categories of data will need to be protected by organizations covered by this law.
Consumers now have the right to request deletion of their information. This mandates that data flow diagrams be created showing the lifecycle of the data. These have been required by PCI DSS and now will be required to effectively assure data destruction of other categories of personal information. More demanding are third party contracts, which now must require data destruction on an individual record basis. Security officers will need some type of assurance that this is being done.
AB-375 does not restrict businesses from collecting, using, retaining, selling or disclosing information that is deidentified. The bill requires that businesses have technical controls to prevent deidentified information from being associated with a consumer, either directly or indirectly. Security professionals will need to understand how “deidentified” is interpreted under this and other privacy regulations and be prepared to support the definition with technology.
Another issue is authentication of consumers who request information about their data. The law requires that a response be provided within 45 days (extensible to 90 days). The security team will need to have a process for verification of the identity of the consumer before any information is released.
For overall risk management, AB-375 attaches financial penalties that help quantify security impact. Damages of up to $750 per incident per consumer may be sought in a private action by consumers. If your firm maintains records on 1,000 consumers, you could be liable for $750,000 under a class action. In addition, the California Attorney General can bring a civil action against a firm in violation of the law, with fines of up to $7,500 per incident.
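A back-of-the-envelope sketch of that exposure calculation, using the per-record figures above (everything else is a hypothetical assumption):

```python
def statutory_exposure(consumers_affected: int,
                       private_damages_per_consumer: float = 750.0,
                       ag_fine_per_incident: float = 7500.0,
                       incidents: int = 1) -> dict:
    """Rough worst-case exposure under my reading of AB-375's penalty figures."""
    private_action = consumers_affected * private_damages_per_consumer
    ag_action = incidents * ag_fine_per_incident
    return {
        "private_class_action": private_action,
        "attorney_general_fines": ag_action,
        "total": private_action + ag_action,
    }

print(statutory_exposure(1000))
# {'private_class_action': 750000.0, 'attorney_general_fines': 7500.0, 'total': 757500.0}
```

Even for a modest consumer base, the private right of action dominates the exposure, which is why breach prevention and encryption matter so much under this law.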
This post illustrates the new frontier for security officers: privacy technology. While not completely new, the teeth provided by GDPR and AB-375 suggest that we all step up our knowledge of privacy technologies and processes.



Yesterday DHS and the Commerce Department released their most recent workforce report “Supporting the Growth and Sustainment of the Nation’s Cybersecurity Workforce”. The report was commissioned by the Trump administration in May 2017. Having studied this issue from roles in academia, private industry and government, I thought I would share my thoughts on the report.
Overall, I think the report does a good job and provides good ideas for improvement. I have always had a bone to pick with reports of astronomical cybersecurity job shortages. The “Cybersecurity Workforce” report states that there are 299,000 active openings for US cyber-related jobs. OK, but when I search (cybersecurity + cyber security) on www.indeed.com I find a total of 53,007 jobs. Somehow 82% of the jobs are not found on Indeed. Where are they? The DHS/Commerce report does acknowledge that we really don't know how many jobs are open and exactly what industry and government need. What is the cybersecurity workforce and where does it need to be? This industry is changing so fast that answering that question may be difficult. I see MSSPs and cloud security services both growing very fast; this will reduce the overall numerical demand.
The report highlights the need for cross-training. I have long thought that more security roles need to move into the business. There are people in those domains who have a good security aptitude and, with some security training, can be extremely effective; 90% of their effectiveness would come from simply knowing the business domain. At the same time, the report's findings note that “employers increasingly are concerned about the relevance of cybersecurity-related education programs in meeting the needs of their organizations.” Later in the report, mention is made of educational programs that focus on technical skills without including the many nontechnical skills needed to implement a security program. That is one of the gaps being identified.
Two other good points are the emphasis on apprenticeships and on certificate programs for cross-disciplinary education. Every type of career training can benefit from apprenticeships or internships. Why is this more important for security education? For one thing, security must be holistic; only a very few people can work purely as individual contributors. Certificate programs for individuals like project managers, business analysts and contingency planners would greatly improve the uptake of security in an organization.
Another very good point relates to career paths. What is the cybersecurity professional's career path? Especially as more workloads move to the cloud and more AI is introduced to SOCs, what will that path be? My recommendation is to define security education more around risk management, covering both information risk and technology risk. A more comprehensive definition at the beginning will permit continued specialization and redirection later. In this way, professionals can expect to be part of any business initiative, all of which will need risk management. Today, almost all business initiatives include information risks. Since employers also want new hires to have immediately usable skills, such education must also include specialized training in at least one technical security area.


In this digital era, anything can be faked: followers, news, experts, emails and so on. The possibilities are limited only by the imagination of the faker. It turns out that these issues were addressed back in 1996 by Carl Sagan, the world-famous astronomer. His context was UFOs, but his formula for separating fact from fiction is even more applicable today. He called his 9-step process “The Fine Art of Baloney Detection” and described it in his best-selling book, The Demon-Haunted World. Here is a summary:
- Whenever possible, there must be independent confirmation of the “facts”
- Encourage substantive debate on the evidence by knowledgeable proponents
- Do not overweight arguments from so-called authorities
- Spin more than one hypothesis for the evidence
- Try not to get too attached to your own hypothesis
- Quantify competing hypotheses
- If using a chained argument, every link must work
- Occam’s Razor: choose the simpler of two hypotheses
- Can the hypothesis be tested?
Keep this list in mind when you are confronted with information that may have significant consequences for you.


Information security over the past few years has been obsessed with zero-day vulnerabilities, hacking exploits and headline-making mega breaches. Every security risk manager is looking for the “unknown unknowns” that could result in untimely unemployment. But is that the right approach? One presentation and one book made me think otherwise.
The presentation was Alex Stamos's talk last summer at Black Hat; you can listen to it here. In this talk he highlights the differences between risks identified by traditional InfoSec and newer risks that he calls “abuse”. A triangle diagram from his talk captures his point; note that its vertical scale is logarithmic. Mr. Stamos's definition of abuse is “technically correct use of a technology to cause harm”. Think user profile scraping, insider trading, spam, doxing, sexual exploitation, etc. The log scale illustrates that the biggest risks are found in the category of abuse; zero-days and targeted attacks are orders of magnitude less important. Searching for the “needle in the haystack”, the holy grail of InfoSec practice, may not be rewarding or cost effective.
The book was The Gray Rhino, by Michele Wucker. It highlights the risks of looking primarily for needles in haystacks and confirmed Mr. Stamos's thoughts. The metaphor here is the Gray Rhino, which may be charging while you are looking for the unknown unknowns. Ms. Wucker's book is written for risk management professionals in general, but by connecting the dots we can apply it to InfoSec. The Gray Rhino is the counterweight to The Black Swan, by Nassim Nicholas Taleb. Black swans are high-impact events that we cannot predict. A Gray Rhino is something you see coming but ignore, for one reason or another: a highly probable, high-impact event. Think of the Equifax breach in 2017; there had been a previously reported breach in May 2016, which I would call a Gray Rhino. Another recent breach is the ransomware attack on Atlanta. Is this a Gray Rhino? Such attacks have been common since 2015. Was the City of Atlanta able to take steps to train users and back up systems? Apparently not yet. How about Facebook and the alleged misuse of user data by Cambridge Analytica? Many InfoSec professionals are looking for hacker attacks. But go back to 2005 and the ChoicePoint breach; that attack could have served as a Gray Rhino warning for Facebook. In that breach, business partners of ChoicePoint exposed data on 163,000 people (a piddling number by today's standards). It should have tightened security within business units of Facebook.
A zoological risk matrix could look like this:
|             | Low Probability | High Probability |
| Low Impact  |                 | White Swans      |
| High Impact | Black Swans     | Gray Rhinos      |
Dealing effectively with Gray Rhinos requires awareness, both individual and organizational. The reasons we don't do so come down to several obstacles:
- Weak response to signals that are seen by many but not followed up on
- Systems that accept as normal a failure to respond
- Impulse to procrastinate (everyone)
- Taboos against raising alarms
- Groupthink
- Too many rhinos attacking at once
This is a short list of causes from the book. All of them apply to information security risk management.
How about mitigations? Ms. Wucker offers some general good ideas that can be applied in an information security context:
- First, acknowledge that your Gray Rhinos are out there.
- Prioritize which rhino you will manage first.
- Accept incremental mitigations and continue to improve on them
- If you do have a security incident, capitalize on it
- Work hard to convince management to take action against distant rhinos before they show up on your doorstep
Going back to information security specific vulnerabilities, the Stamos triangle is a good starting point for finding your own Gray Rhinos. Focus on getting out of the way of the four animals listed below before looking for targeted attacks or zero-day attacks.
Common Information Security Gray Rhinos
- Phishing: user training, repeated regularly, is essential
- Unpatched systems: Do you know the percentage of systems, operating systems, middleware and applications that are not patched, and the corresponding risk levels? (See the sketch after this list.)
- Password reuse and mass compromise: Have you implemented and required MFA on all critical systems?
- Abuse: How could your partners, customers and employees misuse your systems?
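For the unpatched-systems rhino, the question above is really just a coverage metric. Here is a minimal sketch of computing it from an asset inventory; the inventory format and the "fully_patched" flag are assumptions of mine, stand-ins for whatever your patch management tooling reports:

```python
from typing import Iterable, Mapping

def patch_coverage(assets: Iterable[Mapping]) -> float:
    """Percent of inventory assets marked fully patched.

    Each asset record is assumed to carry a boolean 'fully_patched' flag
    maintained by your inventory or patch management tooling.
    """
    assets = list(assets)
    if not assets:
        return 100.0
    patched = sum(1 for a in assets if a.get("fully_patched"))
    return 100.0 * patched / len(assets)

# Hypothetical inventory spanning OS, middleware and application layers
inventory = [
    {"name": "web01", "layer": "OS", "fully_patched": True},
    {"name": "db01", "layer": "middleware", "fully_patched": False},
    {"name": "hr-app", "layer": "application", "fully_patched": True},
]
print(f"Patch coverage: {patch_coverage(inventory):.0f}%")  # Patch coverage: 67%
```

Tracking this number per layer, over time, is a far cheaper early warning than hunting for exotic attacks.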
One of the functions of an outside consultant is to help clients identify their Gray Rhinos, whether those above or others. If you are considering this type of perspective, please drop me a line.
Frederick Scholl
freds@monarch-info.com


The recent government shutdown got me thinking about budgets and information security. Having just submitted a proposal to a small business myself, I am asking the question: What is best practice for small or mid-sized business (SMB) information security?
Every SMB is going to have a limited budget. This budget has to cover control implementation and maintenance. There's no point in minimizing risks if you will run out of money for maintenance at a later date. In this post, I want to address the costs of running the security program on an ongoing basis. Gartner came up with Total Cost of Ownership (TCO) back in the 1980s but hasn't applied it to information security. I am claiming that the cost of maintaining your security program is often overlooked and is critical for an SMB, where budgets are limited.
There are several good references for SMB security. NIST has developed NISTIR 7621, “Small Business Information Security: The Fundamentals”. This document recommends taking an inventory of information assets and then using the NIST Cyber Security Framework to protect them. But there is no discussion of managing security within a limited budget. Another good reference is from Australia: “Cybersecurity: The Small Business Best Practice Guide”. This starts with getting buy-in from everyone in the organization and providing training. The next recommendations include analyzing vulnerabilities, evaluating assets and prioritizing remediation. Again, no discussion of budget issues.
I believe more focus should be placed on the cost of maintaining the security controls that are put into place. For the largest organizations, this might not be a problem, since resources can be made available. For a smaller organization with limited resources, it is better to implement a smaller number of quality security controls than a larger suite of controls where some are not supported properly.
As a hypothetical, let's say the organization has $50,000 to spend on security. Assume each control costs $5,000 to implement in year one; this could include hardware, software, training, consulting, etc. The budget will therefore support 10 controls in year one. If the ongoing maintenance cost is also $5,000 per control, then no new controls can be implemented in year two. The critical point is to allocate sufficient funds to maintain all controls in following years. Year-two-and-beyond funds cannot simply be put into more new controls or technologies. If that happens, control effectiveness will drop off and potentially permit breaches. This effect is confirmed by the Verizon DBIR, which states that almost half of PCI-compliant firms are out of compliance at their mid-year assessment.
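To make the arithmetic concrete, here is a minimal multi-year sketch under the same hypothetical assumptions (flat $50,000 budget, $5,000 to implement and $5,000 per year to maintain each control):

```python
def controls_by_year(years, annual_budget=50_000, implement_cost=5_000, maintain_cost=5_000):
    """Cumulative number of controls that can be implemented and still maintained."""
    totals = []
    controls = 0
    for _ in range(years):
        remaining = annual_budget - controls * maintain_cost  # maintain what you already own
        controls += int(remaining // implement_cost)          # spend the rest on new controls
        totals.append(controls)
    return totals

print(controls_by_year(3))  # [10, 10, 10] -- maintenance consumes the whole budget after year one
```

The sketch shows the squeeze: unless the budget grows or maintenance costs shrink, year one's controls are all you will ever be able to support properly.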
What are the likely maintenance costs of a control? First would be the usual software maintenance charged by the vendor. But a security control is not a stand-alone entity; it must be seen as part of a security system, or its effectiveness is reduced. There will be costs of training staff to make use of the control. Why? Threats are constantly changing, and people are changing jobs or roles. There will be costs of developing processes around the control. Why? The other controls and processes communicating with the control are constantly changing. In today's world, security technology itself is constantly changing, hopefully improving. Therefore, the life of a control implementation will be shorter than that of other enterprise applications. This depreciation expense should also be included in the security budget.
For a small business, it is important to estimate the annual cost of maintaining each security control. That cost should be added to the initial cost of implementing the control. The total should then guide, along with the risk analysis, the roadmap for control implementation. The worst choice is to implement too many controls and then not allocate sufficient resources to maintain them. This may give a false sense of compliance, but it will not provide security.
Contact us today to learn how we can assist you in any aspect of your IT security program.


If only building a security start-up was as predictable as transitioning from caterpillar to butterfly! But, it’s not. Unfortunately it usually requires many turns and corresponding changes. Consider companies like Blackberry, once a ubiquitous handset provider, now an enterprise security provider. Or Radware, once a load balancing product company, now known for its DDoS solutions. The most dramatic change in our industry is Amazon, once a book company, now marketing a whole range of secure cloud solutions.
If you are a start-up, you want to avoid the dreaded “pivot” with its associated hard resource costs and, potentially, people costs. How do you keep up with constantly changing marketplace requirements without pivoting? I recently discovered an amazing tool for this purpose: the Business Model Canvas. It's not brand new, but if you aren't using it, please read on for a short introduction. For details and much more, please see the original work, Business Model Generation (2010), by Alexander Osterwalder and Yves Pigneur. It is one of the best practical business books I have read.
Business Model Canvas
This canvas approach allows you to build a picture of your prospective business on one page in a one-day brainstorming session. The 9 categories in the canvas illustrate the key things you need to get right. Notice that technology is not specifically one of them! Of course, it is embedded in all of the categories, especially “Value Proposition”. Value Proposition is not what you do great, but why prospects will choose you over competitors or over doing nothing. It is the business elevator speech.
The other great thing about the “Canvas” is that it is easy to change. Whereas a formal business plan might fill up 30-100 pages, the Canvas can be changed on a regular basis. This facilitates the incremental, lean approach to business model optimization. In today’s rapidly changing market, this is a critical success factor. The one page canvas forces you to consider all of the components needed for business success. Who are your customer segments? If you don’t have marketing focus, you don’t have marketing.
Osterwalder and Pigneur have some great suggestions on how to build the canvas for your venture, and then how to follow it up and execute. The next step after building the model is developing a formal business plan. Their prototype plan has six sections: “Team”, “Business Model”, “Financial Analysis”, “External Environment”, “Implementation Roadmap”, and “Risk Analysis”. The material supporting the canvas can be used as input for each of these sections. For example, “Cost Structure” and “Revenue Streams” will map to “Financial Analysis”. What we have here is a business plan in a box! Now you just need to test it with real customers and tweak as needed.
What if you are an intrapreneur, within an existing business? Osterwalder and Pigneur show how to map their 9 business model categories to the 5 key domains within an existing organization: strategy, people, structure, rewards and processes. By using the language of the existing parent organization, you can achieve start-up goals effectively.


There are many posts on corporate directors' responsibilities toward the organizations where they are board members. In fact, corporate directors themselves may be targets for hacktivists or cybercriminals and need to make sure they have adequate protection, at both home and the professional office. Directors obviously have access to sensitive insider information that many unauthorized parties would like to obtain. Many directors will also be targets as High Net Worth (HNW) individuals. Cybercriminals always target the weakest link; as corporate information security improves, they will increasingly target the home networks of key executives and directors.
Breaches such as Equifax have put so much personal information into the hands of criminals, that individuals increasingly will become targets. Directors represent a perfect demographic cross-section to be attacked. Attack vectors may include phishing, ransomware and social media.
Earlier this year, an NSA employee was in the news as hackers apparently stole US government secrets from his home network. Directors with access to confidential strategic or financial information should make sure their home networks are protected above and beyond the usual consumer grade defenses. Another attack path may be through tools and services used by directors. In 2010 attacks were reported against Directors Desk, a NASDAQ meeting portal. It is not clear if any sensitive information was stolen at that time.
What should directors do? First, make sure your home network is built to corporate standards. You need a commercial firewall, not just a consumer router. Most critically, any device, especially firewalls and routers, should auto-update its firmware. Auto-update is now included in Windows 10, most smartphones, and many home network devices, but not in older devices. Anything you put on your network will eventually be found to have vulnerabilities, so this software and firmware update feature is critical to keep hackers out.
Passwords represent a second critical area; many breaches result from theft of user credentials. You should use two-factor authentication to log in to sites with your financial or personal information. Smartphone applications such as Google Authenticator and Duo Security generate one-time tokens that serve as a second factor. More familiar is the text messaging that many sites still use to send one-time codes to users. This process has been deprecated by the Federal government (because of potential eavesdropping attacks), so use the dedicated security apps if possible. Other financial sites do not yet offer any two-factor authentication; for these, make sure to use strong passwords of at least 12 characters. Such complex passwords should be managed using password vaults like LastPass or KeePass.
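If you are curious what those one-time tokens actually are, here is a minimal sketch using the pyotp library; the secret shown is generated on the spot, whereas real secrets are provisioned by each site when you enroll:

```python
# pip install pyotp
import pyotp

# A site enrolling you in app-based MFA generates a base32 secret like this
# and shares it with your authenticator app, usually via a QR code.
secret = pyotp.random_base32()

totp = pyotp.TOTP(secret)   # RFC 6238 time-based one-time passwords
code = totp.now()           # 6-digit code that changes every 30 seconds
print(f"Current code: {code}")

# The site verifies the code you type against the same shared secret.
print(totp.verify(code))    # True within the validity window
```

The point of the second factor is that a stolen password alone is no longer enough; the attacker would also need the device holding that shared secret.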
The last factor to consider is encryption. Never store any sensitive data online without encrypting it, using a password known only to you. It is true that collaboration sites like Dropbox do encrypt the data you save there. But they still hold the encryption keys and can view the data, and those keys can be hacked or stolen by a disgruntled employee. Provider-held keys are fine for 99% of the information you store online, but for the other 1%, especially personal or corporate sensitive material, only you should have the encryption key. Applications like Boxcryptor integrate with Dropbox and enable you to further protect your information.
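As a sketch of the general idea of client-side encryption (not Boxcryptor's actual mechanism), using Python's cryptography package: you encrypt before the file ever reaches the cloud and keep the key yourself.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate and keep this key yourself -- never upload it alongside the data.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"board briefing notes - directors only"
ciphertext = f.encrypt(plaintext)   # this is what gets synced to Dropbox etc.

# Only someone holding the key can recover the contents.
assert f.decrypt(ciphertext) == plaintext
```

The design choice is simple: the cloud provider stores only ciphertext, so a compromise on their side exposes nothing readable.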
These three security precautions will help you keep your personal and professional information secure. Since threats and vulnerabilities are constantly changing, you must keep up to date using online resources and peer groups covering this topic.
Contact us today to learn how we can assist you in any aspect of your IT security program.


This topic came up because of two recent headlines and one new book. The first was the news that the now former Equifax CISO was a music major, without formal college-level tech or security training. The second was the recent article in the WSJ highlighting Bank of America's new Chief Operations and Technology Officer, Cathy Bessant. Ms. Bessant's outstanding background includes general management and marketing, but not specifically technology leadership. The book I mentioned is Mark Schwartz's A Seat at the Table (2017). Mr. Schwartz argues that, today, tech leaders need a hard core of tech knowledge and can't be just managers putting on a propeller hat. He bases this conclusion on the rapid and deep penetration of technology into all business operations and the continued rapid change in that technology. In many cases, business leaders will take the initiative to adopt new technology. In this situation, everyone in the organization is tech savvy, but it is the tech leaders who must maintain a deep understanding and the ability to evaluate risks.
Who is right? Are these two Fortune 500 firms behind the times? In this post, I am going to look at some published opinions from four other tech leadership gurus and then offer my own thoughts.
How about Charlie Araujo’s book, The Quantum Age of IT (2012)? He defines the five skills needed by new IT leaders. These are: (1) IT financial management; (2) critical thinking and analytical skills; (3) communication and marketing; (4) innovation and collaboration; (5) leadership. I think this list is great, but it does not specifically mention deep technical skills.
My next guru is Peter High, in his book Implementing World Class IT Strategy (2014). This book contains many nuggets for implementing change. Scanning through it again, I see that Mr. High advises the CIO to be a tech trend spotter; further that IT leaders should serve as tech consultants and facilitators to the business. These functions will require deep tech skills in those IT leaders.
Next, let’s look at The CIO Paradox (2013), from Martha Heller. I reread the last chapter “The Future of the CIO Role” to look at the conclusions. Comments here include: “every company is a technology company”; “CIOs flexing their technology muscles again”; “one version of the future is the technical CIO”. So, in my view, Ms. Heller also highlights a deep tech role for IT leaders.
My fourth guru is Al Guibord, whose book IT Leadership Manual (2012) provides insight on dealing with IT leadership challenges. This book has a wealth of ideas on developing effective relationships in business; but, does not address the need for leaders to maintain deep technology expertise.
So, adding up the score, three gurus support deeper tech knowledge for IT leaders, and two focus on soft skills. Today, trends in agile/devops and cloud are giving more power and knowledge to business leaders. Jeffrey Immelt, recently departed GE CEO, stated that all new hires will learn to code. Tech leaders are going to need even deeper knowledge…along with all the soft skills. So, yes, I believe the CIO and other IT leaders should learn to code.


In this era of digital disruption, business leaders are turning to technology to keep up. But, will they continue to turn to traditional IT leaders to map out the future? This is the question addressed by Mark Schwartz’s new book A Seat at the Table. Mr. Schwartz engagingly analyzes the present and provides guidance for IT leaders to get and keep a “seat at the table”.
In the beginning, we had Waterfall systems development. CIOs could take orders from business leaders, translate the orders into technology roadmaps, develop milestones and implement systems. Then the business discovered SaaS and the “Shadow IT” department was born. The most recent trend is Agile/DevOps, in which business collaborates directly with development and DevOps engineers are tasked with implementing code. What is the role of IT leadership when business leaders are directing systems development?
Gartner has defined Mode 1 and Mode 2 activities for IT. Mode 1 is “keeping the lights on” and Mode 2 is managing digital disruption, as I see it. Mode 1 isn’t just maintaining a static playbook; it requires development and active management. Mr. Schwartz recounts a project he ran, as CIO, to improve the throughput of an existing application. He had $20m to do this job, but the exact improvement goals were not defined. This is a perfect example of Agile/DevOps applied to a technical problem. The improvements achieved were more than sufficient for the mission.
What about the Mode 2, business disruption type projects? Mr. Schwartz suggests that the role of IT leadership migrates to influencing rather than controlling. If Agile/DevOps iterative, collaborative work replaces Waterfall, then the role of “IT managers” is to let the team run with the problem, instead of holding everyone to predetermined milestones. I agree with this conclusion for many projects. Others, such as meeting an SEC compliance requirement by a specific day, might still be better implemented using waterfall methods.
Chapter 13 outlines the specific things CIO’s and other IT leaders can do to exercise leadership in the new IT world. Here are some of his suggestions:
- Be a driver of business outcomes, not just technology outcomes. This reminds me of CIO Rik Reitmaier’s presentation on how he collaboratively developed a mobile application for taking poolside drink orders at Gaylord.
- Steward of Assets. This is the manager of Mode 1 initiatives. No small challenge, given the years of IT investment that fall into this bucket.
- Contributor. The IT leader of the future has to be more technical to help provide a technology map to business leaders. This is interesting, because 5-10 years ago, many thought that the CIO had to be an MBA and not a hardcore technologist.
- Influencer and Salesperson. The new CIO will have to influence the approaches used to develop transformative systems. He/she may also need to sell the organization’s technology to outside investors and customers.
- Enabler. Where will the platforms and systems to rapidly develop next generation capabilities come from? This will be the role of the CIO.
If you are in technology, this book provides guidance on where your role may be going. It is well written and insightful. My own conclusion is that IT leaders will need to become like the Roman god Janus: one face looking at the past and how to improve it, and one face looking far into a future that is arriving faster than we can appreciate. Looking at the present alone doesn't work anymore.
Contact us today to learn how we can assist you in any aspect of your IT security program.


The Equifax data breach illustrates again the need for speed in security management. If the breach was through a known vulnerability, we wonder why it wasn't patched. If through another path, we wonder why the attack wasn't detected. We have so many incident and event management tools for servers, desktops and networks that it is hard to believe Equifax did not have such tools. In the past, breaches like this have resulted from delays in detecting or reacting to attacks.
As the pace of digital business transformation continues to increase, security management needs to increase its rate of change. The OODA loop has been highlighted as a general approach to fast, accurate decision making. Recently I came across a really good explanation of this by John Braddock, a former CIA case officer. You can check out his book on Amazon: A Spy’s Guide to Thinking. Let’s look at what these frameworks are and how to use them in cybersecurity management.
The original OODA (Observe-Orient-Decide-Act) loop was devised by John Boyd, an Air Force pilot in the Korean War. The process was developed to help pilots prevail in combat engagements. Although Boyd did little to document his ideas, others picked them up and they have since permeated business thinking; for example, see Certain to Win, by Chet Richards, one of Boyd's associates. The key point of the OODA loop is to minimize the time it takes to transit the loop. For cybersecurity this can be the time to detect an incident, the time to respond to an incident, or the time to patch a known vulnerability.
A Spy's Guide to Thinking gives a very good explanation of the OODA loop, one better aligned with information security than with MiG pilots' behavior. To start with, Braddock reframes the action loop as the D-A-D-A, or Data-Analysis-Decision-Action, loop. Everyone in security starts with data, usually too much data. Analysis, decisions and actions follow, and the speed at which we go through this process determines success or failure. Another very interesting point made by Braddock is that the process should not actually start with data, unlike the OODA loop, where fighter pilots begin by observing the enemy.
In Braddock's version, the starting point is “decision”: in security we need to know what we are looking for before starting the data analysis, because there is just too much data. That decision might be searching for specific IOCs, or doing a risk analysis to decide which compliance gap to fix first.
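Here is a minimal sketch of that decision-first loop applied to security monitoring; the structure and names are my own illustration of Braddock's D-A-D-A idea, not code from his book:

```python
def dada_cycle(question, fetch_data, analyze, act):
    """One pass through a decision-first Data-Analysis-Decision-Action loop.

    The initial 'decision' is the question that scopes everything else,
    e.g. "Are any hosts beaconing to known C2 domains?".
    """
    data = fetch_data(question)   # Data: collect only what the question needs
    findings = analyze(data)      # Analysis: reduce data to findings
    if findings:                  # Decision: is action warranted?
        act(findings)             # Action: respond, patch, escalate

# Hypothetical usage with stubbed-in functions
dada_cycle(
    question="Are any hosts beaconing to known C2 domains?",
    fetch_data=lambda q: ["10.0.0.12 -> baddomain.example"],
    analyze=lambda rows: [r for r in rows if "baddomain" in r],
    act=lambda findings: print("Escalate:", findings),
)
```

The speed advantage comes from the scoping question: the loop never tries to boil the whole data lake, only the slice the decision calls for.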
To summarize the application to effective security risk management:
- Move through the DADA process at the speed of business change
- Don’t start with data (observe); start with a framework that defines what you are looking for. This includes what the business is looking for.


In recent Information Security news, The Wall Street Journal reported on the upcoming trial of an alleged botnet master. The trial is in progress now.
It is not often that we get a look at the details of a computer security breach, but in this case at least some details are in the docket of the Eastern District of NY. I have downloaded the original complaint of US v. Gasperini here. The accusations include violations of the Computer Fraud and Abuse Act, Wire Fraud, Conspiracy to Commit Wire Fraud, and Conspiracy to Commit Money Laundering. All of these acts were allegedly undertaken in a click-fraud scheme. If you want to understand the details of these accusations, I uploaded the judge’s jury directions here.
The defendant allegedly hacked into QNAP NAS devices using the Shellshock vulnerability and downloaded click-fraud software onto them. These are network devices that many people do not patch regularly. Unfortunately, the court transcripts don't describe how he got past firewall security.
How did the government find out about this? Apparently from an informant, “CS-1”. The fraud scheme was carried out using a “target” website which ran ads from victim companies. Payments based on clicks were then made to the defendant, according to the government.
The defendant's prospective expert testimony gives some ideas on how he will challenge the government's case: given the complexity of the Internet advertising business and its tracking and verification techniques, is it plausible that this fraud could have been carried out? The defense experts are also going to testify about the details of the scripts used and the forensic examination of servers.
It will be interesting to see the outcome of this jury case; maybe next week. In the meantime, patch your servers!


Every year, MIT Technology Review publishes its list of the 50 smartest companies. This year, two information security companies made the list, along with big time players like Amazon, SpaceX, etc. TR doesn’t publish the detailed selection criteria, but they include things like: ability to dominate the chosen market and innovative use of technology. The two security companies on the list are pretty much unknown in the general US marketplace, but according to TR, are not likely to stay that way.
#11 on the TR list is Face++ (faceplusplus.com), a business that has gone beyond startup in facial recognition. The company is based in China, where its technology is embedded in many online services. Other companies such as LTU (www.ltutech.com) have pioneered image recognition; Face++ has concentrated on facial recognition. Its $1B valuation may well be supported by the Chinese market alone. It's not clear whether this technology will be popular in the US, where many people may not want their facial images stored in a database (despite the fact that we are already all on Facebook).
#35 is ForAllSecure (forallsecure.com), a start-up out of Carnegie Mellon in Pittsburgh. Their mission is securing code through machine analysis techniques. Can their technology, known as Mayhem, work? They did win last summer's DARPA Cyber Grand Challenge. Is there a market? Definitely. Code security today is a hodgepodge of static analysis, dynamic analysis, dependency checks, pen testing, etc. But time is of the essence, as agile/devops pipelines turn out new deployments at an accelerated rate. A proven and reliable machine technique for finding and remediating bugs would revolutionize information security.


Play Bigger is a new book by entrepreneurs for entrepreneurs (2016, Harper Business). The authors' theme is that today's markets are so crowded that you cannot rely on niche marketing into white spaces; you have to create your own white spaces, or “categories”. The goal is to be a “category king”. The idea of niche marketing has been around forever; Ries and Trout documented it in their classic, Positioning (2000). Authors Ramadan, Peterson, Lochhead and Maney propose that in today's markets, with enough money, genius and hard work, you can create your own category. To build a business using their approach you need to create a category, a product and a company, and they all need to work together. This is sound advice. The challenges are: what is your idea, how big is your category, and is it defensible? iPads, ERP software and SaaS are examples of unique new categories that have gone to the business hall of fame. Even if you don't have ideas this “big”, you can still take away very useful ideas from Play Bigger.
One observation is that a single “lightning strike” will not create a category. Only a series of connected initiatives will have the desired effect. While reading Play Bigger, I also watched “The Founder”, a movie about Ray Kroc’s life (highly recommended). In one scene he plans a lightning strike opening of a new McDonald’s in California. Unfortunately, a swarm of summer flies drove away customers. He was undeterred, obviously, and kept redefining McDonald’s categories from local hamburger stand, to regional business and beyond. The one major contributor to his success: persistence.
A second observation is that new categories come in all sizes and can be created by new or old businesses. Last week I got a flyer from King Arthur Flour, a 227-year-old, Vermont-based B2C provider of flour and baking equipment. It advertised flour and baking pans for baking your own hamburger buns. In my mind that is a different category, not just a better product. With different you can become a category king; with better you will be one of the pack.
The final chapter of Play Bigger applies the category concept to individual professional life. The authors suggest that you need to create your own category, obtain the backup knowledge to support it, and set up an ecosystem (mentors) of people to support your goals. Good advice for all.


The current worldwide attack from WannaCry is going to have lasting impact for information security. The question is: what will that be and who will benefit? In this blog post I will take a contrarian viewpoint and suggest that it will not be beneficial to security practitioners or security businesses. I think business leaders, who fund security programs, will take alternative approaches to mitigating this risk.
At present, we have over 1600 security firms offering solutions to attacks like WannaCry. Unfortunately, this patchwork quilt mitigation approach isn’t working. Not because of the security firms, but because there are too many potential leaks in the ship to manage. So, I predict that business leaders will change ships and increasingly move legacy systems and new systems into the cloud. This is already happening and incidents like WannaCry will accelerate it. No business person is going to upgrade XP systems to Windows 2016, when they can hand over security responsibility to someone else. Of course, security is still a joint responsibility, but the optics make it look like the cloud vendor owns it. For a good summary of cloud growth trends today check out Forbes here. The consensus summary for cloud growth is 18-19% per year through 2020!
The other beneficiary will be the cyber insurance industry, especially in the areas hardest hit. WannaCry brings cyber remediation costs to the attention of the board. Board members understand enterprise risk and insurance mitigation; they don't understand the technical details of ransomware and phishing attacks. But before cyber insurance takes off we will need a common language to describe risk. The NIST Cyber Security Framework (CSF) represents such a language, and the new Cybersecurity EO requires that Federal agencies utilize the CSF. If it moves into broader use in private industry, we will have a basis for stronger insurance mitigation of cyber risks. Security professionals need to understand the benefits and limitations of these insurance policies and include them as an active part of threat mitigation.

Last night I went to a screening of Laura Poitras’s movie about Julian Assange. If you are interested in national security, I highly recommend the film. I had expected a big crowd, but Nashville’s Belcourt was only about 20% full.
Love WikiLeaks or hate WikiLeaks, it is likely Assange will continue to be in the news. The movie ends with Attorney General Sessions's statements directed toward putting Assange in jail. The rest of the movie covers the period back to 2006, when WikiLeaks was founded. You can come to your own conclusions as to whether WikiLeaks was, or still is, a valuable publishing forum.
I came away with questions such as: who is funding this organization? Or, what is the public benefit of disclosing the Vault 7 CIA documents? Should WikiLeaks be using Twitter to promote the hack of the Macron campaign, as some have reported?
The media blitz surrounding everything Assange does (including this movie) is shocking. Probably not for those participating, but for those of us on the other side of the camera and microphone. How can anyone make rational decisions in the glare of these lights? I don’t think it is possible.


One of the biggest cyber threats that many US companies face is theft of their intellectual property (IP). This includes trade secrets, patents, software and copies of tangible goods. The recently released “Update to the IP Commission Report” gives tangible, current information on all four categories. The original report was published in 2013 amidst headlines about Chinese cyber-attacks on US businesses. The conclusion of the February 2017 update is that IP theft continues, although the headline-grabbing thefts may have dropped off. Not all IP theft results exclusively from a cyber-attack, although most thefts are cyber-enabled.
The most notable report points related specifically to trade secret theft are:
- Total loss to US economy in the range of $180-$540 billion per year
- IP-intensive firms responsible for 35% of jobs in US labor force
- No evidence that China has stopped hacking US firms
The updated report also includes recommendations, including those for improved cybersecurity:
- Implement vulnerability mitigation measures such as information sharing
- Support US companies that can identify and recover IP stolen through cyber means
- Reconcile needed changes in the law with changing technical environment
The report is a good read for anyone interested in locking down trade secrets and other intangible assets against cyber thieves.


The Tennessee legislature recently passed a modification to the state privacy breach notification requirements, § 47-18-2107. The modification has been sent to the governor for signature. Unfortunately, the modification just confuses the law’s requirements.
The existing code says that a breach notification is required if “unauthorized acquisition of unencrypted computerized data” takes place. The breach also has to materially compromise the security, confidentiality, or integrity of personal information. This seems clear to me.
The new code says that notification is required when acquisition of computerized data that materially compromises the security, confidentiality, or integrity of personal information takes place. The data does not have to be unencrypted.
However, subsections add an exception for encrypted data: if the data breached is encrypted, breach notification is not triggered. One encryption exception is for data encrypted in accordance with FIPS 140-2, a Federal Information Processing Standard; I have never seen this used in private business. The second exception is for information that has been made “unusable”. On the face of it, this would seem to include any type of “encryption” process, good or bad.
So, in the old (current) law, if you lost unencrypted data, you had to carry out notification. The new law seems to say that that’s still true, but if you have any reasonable encryption process, you have no duty to notify.
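To make the comparison concrete, here is a small sketch of the notification logic as I read the old and new statutes; this is an illustration of the argument above, not the statute's exact text and certainly not legal advice:

```python
def must_notify_old(material_compromise: bool, data_encrypted: bool) -> bool:
    """Old 47-18-2107, as described above: notify on unauthorized acquisition of
    UNENCRYPTED data that materially compromises personal information."""
    return material_compromise and not data_encrypted

def must_notify_new(material_compromise: bool,
                    fips_140_2_encrypted: bool,
                    rendered_unusable: bool) -> bool:
    """New version, as described above: encryption is out of the trigger itself,
    but subsections exempt FIPS 140-2 encrypted or otherwise 'unusable' data."""
    if fips_140_2_encrypted or rendered_unusable:
        return False
    return material_compromise

# Unencrypted personal data acquired: both versions require notification.
print(must_notify_old(material_compromise=True, data_encrypted=False))             # True
print(must_notify_new(True, fips_140_2_encrypted=False, rendered_unusable=False))  # True

# Data run through some nominal process that counts as 'unusable': the new
# exception removes the duty, however weak the process -- the confusion noted above.
print(must_notify_new(True, fips_140_2_encrypted=False, rendered_unusable=True))   # False
```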


If you are like me, you have read through many articles and books on leadership. Most security professionals come with a technical background that does not directly facilitate leading people. But solutions aren’t easy to find, either. Many leadership training programs seem vague to me. What about “soft skills” vs. “hard skills”, Aristotle vs. Socrates, or the “art” of leadership? What are we to make of this? We are used to meeting compliance requirements and managing risk.
Here is a book that I think fills a gap in leadership education: The Little Book of Leadership Development, by Scott Allen and Mitchell Kusy.
Allen and Kusy offer 50 ways to bring out the leader in every employee. Yes, a checklist of 50 items! The book’s focus is on how you can develop leadership skills in the employees you manage. But, in the process of doing this, you can develop your own leadership skills!
The book is divided into five sections, each containing one- to two-page tips on leadership. One section is devoted to you, the leader. The other four sections cover leadership topics you can share with your team: Skill Building, Conceptual Understanding, Personal Growth, and Feedback. My reading turned up quite a few valuable nuggets of information. Highly recommended!


On my way into the office this morning, I listened to a podcast interview of a well-known SIEM vendor. I got more and more frustrated at the wheel, but did make it to the office without incident. The focus of this conversation was the plethora of log sources that this vendor could ingest—system, network, endpoint—and the machine learning used to analyze the data.
This is backwards. Good security designs need to start with the CUSTOMER. Yes, the customer. Who are the specific people who want information, and what exactly do they want to see? Users could be auditors, security operations, the CISO, security analysts, developers, etc. Any log files collected beyond what those users need are irrelevant.
This approach is just lean thinking applied to security. Lean itself has been discussed in many books; I discussed it in the context of security here. The first lean principle is “voice of the customer”: SIEM tool design needs to run backwards, starting with the user interface, not the sources of data. Another lean principle is “systems thinking”, in other words, how the product or tool under discussion fits into the larger need of protecting information. Virtually every security product discussion I am part of focuses only on that product's small part of the assurance puzzle. I think CISOs are getting tired of this, and I hope vendors will take notice.

