Crossing the Compliance Chasm

There is a wide gap between regulatory compliance mandates and their practical implementation and enforcement that I like to call the “Compliance Chasm”. That chasm is defined by the tension between protecting consumers and considering the economic and operational impact on business enterprises. Finding that balance requires thought, not the more popular whack-a-mole strategy of reacting to each new compliance mandate.

Regulatory fines for non-compliance are rising in both frequency and size. In January 2023, Meta was fined $418 million for GDPR violations by its Facebook and Instagram properties. Ireland’s Data Protection Commission followed up in May of that same year with a $1.3 billion fine for additional violations. And those were just the latest fines imposed on web giants, a list that also includes Google and Amazon.

The targets of those fines might be justified in saying compliance is an impossible task. By 2025, the volume of data created, captured, copied, and consumed worldwide is forecast to reach 181 zettabytes. Nearly 80% of companies estimate that 50%-90% of their data is unstructured: text, video, audio, web server logs, or social media activity.

Read more...

Election security is not a technology problem. It is how naive we are

When it comes to election security, the technology we use to vote and count those votes is not the problem. The problem is how naive we are.

Election security has been at the forefront of daily news cycles for more than a decade. Concerns about illicit use of the technology that inputs and counts votes turned out to be largely overblown. Every U.S. state other than Louisiana uses paper ballots, matching the practice of every other western democracy. Lawsuits have bankrupted people and organizations that claimed the technology was changing votes. Those who complained the loudest about election interference are now facing prosecution for those crimes.

Now the tech focus is on the use of artificial intelligence to create deepfake video and audio. A recent pitch from Surfshark,

Read more...

RSAC Reporter’s Notebook: Change is coming

The cybersecurity industry is absolute chaos, and rightly so. This is the industry charged with plugging dikes during the Category 5 hurricane that the internet seems to be today. Nowhere is that chaos more evident than at RSAC, if only from a marketing perspective. Everyone has “ground-breaking”, “industry-leading”, and “first ever” product offerings, and this year was no different. But if you can look past the Macho Man impersonations, Formula One cars, and the mesmerizing miasma of the website and show floor, you can see an order forming in the chaos. Change is coming.

Back to step one

RSA CEO Rohit Ghai said we have missed a step in AI development. “We’ve seen it first as a co-pilot alongside of a human pilot and then see it taking over flying the plane.” He said the missing first step is making AI an advanced cockpit, one that makes it easier for less trained and experienced people to do the work. He pointed out that cybersecurity is an industry with negative unemployment, making it difficult to find experienced technicians to do the work.

Last year, any discussion of ethical AI development was met with confused stares. This year, the need for ethical AI development is taken seriously, but few can see a profit in it. Cybersecurity VC Rob Ackerman (DataTribe) and Carmen Marsh, CEO of the United Cybersecurity Alliance, were open to suggestions:

“From the perspective of (companies like OpenAI), I understand the reasons to go as fast as they can to develop a true artificial intelligence, the question is, who are the people in the room guiding the process?” said Ackerman. “Once you get a diverse set of advisors working on the problem, then you do the best you can to create something ethical.  But right now, we aren’t even doing the best we can.”

Read more...

Social media hangs itself in TikTok legislation

The appropriateness of the Congressional action against TikTok can be debated for a long time, and probably will be until the Senate takes action, which could be weeks away. What is less debatable is TikTok’s, and pretty much all of the social media industry’s, contribution to the situation. In essence, social media has hung itself with its own lifeline.

The industry has long embraced Section 230, a provision of Title 47 of the United States Code that classifies its platforms as part of the telecommunications industry. That law immunizes social media platforms and users from legal liability for online information provided by third parties. The section also protects web hosts from liability for voluntarily and in good faith editing or restricting access to objectionable material, even if the material is constitutionally protected. These protections do not apply to what is traditionally known as “the media.” That is an important distinction.

The FCC also restricts foreign ownership of telecommunications, broadcast, and cable companies; it is simply not allowed. If TikTok expects protection under Section 230, it has to abide by all FCC regulations, including those on ownership. In that case, the legislation is consistent with US law.

News media or Telecom?

However, the CEO of TikTok has made the case that the legislation infringes on the First Amendment rights of the company, creators, and users because… wait for it… TikTok is a major source of news for its users. In other words, it is a news medium. According to TikTok, 43 percent of users rely on the app for daily news. But that sets up an entirely different problem.

Print, broadcast, and cable media are bound by ethics and law to publish the truth. If they knowingly publish defamatory and untrue information, they can be sued by the injured party. That was most recently and famously demonstrated in the lawsuits against Fox News and Rudy Giuliani for intentionally spreading lies about election technology in the 2020 US election.

Those same lies were and still are spread on social media platforms, including TikTok, with impunity under the protection of Section 230. But if TikTok is a news medium, the protections of Section 230 go away, and TikTok and the creators who spread disinformation can be held accountable for libel and slander.

Social media companies can adjust algorithms to limit what kind of information is distributed on their networks, and they reluctantly apply those restrictions when pushed to. But they can’t be sued for disseminating that information under Section 230. If they

Read more...

Like Digital Cicadas, Cybercriminals Lie In Wait Before Unleashing Their Presence

A curious parallel can be drawn between cybercriminals and the intriguing phenomenon of cicadas. Akin to the periodic insects that emerge from the ground after years of dormancy, cybercriminals often resurface with renewed vigor, unleashing their disruptive activities on unsuspecting organizations.

Read more...