If your organization isn’t already moving into the metaverse, it soon will be. Be warned: today’s security protocols and privacy laws may not apply to 3D worlds.
The metaverse is coming; businesses and government agencies are already building virtual worlds to support city services, meetings and conferences, community building, and commerce. They’re also rendering spatial apps around travel, car sales, manufacturing, and architecture in what Citi predicts will be a $13-trillion market with 5 billion users by 2030.
“Just as the internet, e-commerce, social media, smartphones, and remote computing have in the past two decades changed the ways companies operate and reach their employees and customers, organizations are now experimenting with the metaverse because they are seeing this as an extension of prior transformations,” says Cathy Barrera, founding economist of Prysm Group, which partners with the Wharton School in teaching executive education programs on metaverse business and blockchains.
New privacy and security issues will arise within these 3D worlds. As platform providers jostle for dominance, expect risks in the metaverse similar to those we’ve seen on social media: phishing, pharming, impersonation, disinformation, and inroads for ransomware. There will also be new impacts on consumer privacy, because the rich, detailed data these apps collect is a juicy target for criminals and marketers. “Metaverse technologies will require a great deal more data to be collected than is already collected in social media, such as how you’re turning your head and where your eyes are focused just to position displays correctly,” Barrera says.
New frontiers of deception
Social engineering-based crimes are already rampant on today’s Web 2.0 internet. Ransomware operators use a good hook to get people to click links in emails, and malicious ads are served up by Google and other search engines, over social media, and even through video conference and chat platforms.
Now consider the 3D immersive internet, in which an avatar that looks like the boss or the boss’s boss asks an accounting exec to transfer money (a metaverse version of today’s business email compromise, or BEC, scams). Or imagine fraudsters hacking user accounts to break into development worlds and siphon intellectual property.
Some of these are already happening. Arkose Labs, an online account security and fraud prevention company, reported that in 2021, metaverse businesses faced 80% more bot attacks and 40% more human attacks than other online businesses. Built to bypass traditional defenses, these attacks focused on digital identity theft to carry out microtransaction fraud, spam, scams, and unfair competition.
While security experts point to authentication and access controls to protect against metaverse-based scams and attacks, the growing number of platforms providing access to the metaverse may or may not have secure mechanisms for recognizing frauds, says Paul Carlisle Kletchka, governance, risk, and compliance (GRC) analyst with Lynx Technology Partners, a provider of GRC services.
“One of the major vulnerabilities is the lack of standardized security protocols or mechanisms in place across the platforms,” he says. “As a result, cybercriminals can use the metaverse for a variety of purposes such as identity theft, fraud, or malicious attacks on other users. Since people can download programs and files from within the metaverse, there is also a risk that these files could contain malware that could infect a user’s computer or device and spread back into the organization’s systems. Another threat is piracy: since the metaverse is still in its early stages of development, there are no laws or regulations written specifically for the metaverse to protect intellectual property within this digital environment.”
Much more data to harvest and protect
This is why CISOs and the businesses they support need to get in front of these new risks to their business and user data, says Michael Bruemmer, head of the Global Data Breach Resolution unit at Experian. He predicts that the growth of metaverses will open up new real estate for attacks. He also cites a lack of standards and regulations, comparing metaverses to the “Wild West.” At the very least, he points to weak authentication used in public metaverse platforms to encourage new users to sign up.
Bruemmer, who authored Experian’s tenth annual 2023 Data Breach Industry Forecast, also cites a lack of enforcement mechanisms for privacy violators, which goes hand in hand with a lack of regulation. “Look at Meta’s Oculus headsets or Microsoft’s investment in chatbot services. Consider what data they are collecting, whether it be username, password, credit card, device ID, pulse rate, movements, what you interact with in a cityscape environment, geolocation history—it’s all an unknown in terms of what regulations apply.”
Virtual reality specialist Louis Rosenberg explains in an Into the Metaverse podcast how this and other rich data could be easily exploited to influence buyers and increase polarization like what we currently see on social media platforms. An AI-enabled marketing chatbot masquerading as just another person in a virtual world could be telling a potential consumer about a cool new car they bought. This form of predatory deception can go miles beyond today’s social platforms by using intelligent algorithms to monitor the target’s speaking style, facial expressions, pulse rate, blood pressure, and heart rate so it can apply “ultimate persuasion,” he said in the podcast.
Yon Raz-Fridman, host of Into the Metaverse and founding CEO of Supersocial, a builder of virtual worlds, says his company develops business solutions on the Roblox gaming platform because of Roblox’s long history and experience building privacy and security into its platform. He says his company helps clients create virtual worlds that nurture communities and awareness around their brands and products. For example, Supersocial engineers and designers created the Nars Color Quest for the Nars cosmetics brand, which became the number one beauty experience on the Roblox platform.
“The big advantage of building on the Roblox platform is that it’s relatively safe and stable. When clients ask about privacy and safety, we provide them with the best practices of the platform so they will fully understand some of the potential risks and how they are mitigated by the platform. We don’t own the platform, so we lean on the safety and policies outlined and managed by Roblox,” Raz-Fridman says.
3D regulations will differ from 2D
While graphical and immersive, most of today’s metaverse experiences are still two-dimensional. But Experian’s Bruemmer predicts that 2023 will become the year of headset-enabled augmented reality (AR) and virtual reality (VR), to which today’s regulations won’t apply. But privacy attorney Liz Harding says that newer laws such as GDPR may provide at least some guidelines, particularly in global worlds.
Harding, who is the technology transactions and data privacy vice chair at the Polsinelli law firm and is qualified in both the UK and the US, says that “with metaverse technologies, there are big questions around jurisdiction. Say that I’m in the US, and I have a colleague in Germany and we’re meeting in the metaverse and data is being collected or the meeting is recorded. It will be hard to make the argument that the laws from where the platform is hosted are the only laws that apply, particularly if you are knowingly bringing people from different jurisdictions into those interactions.”
Tracking where those people are physically located and collecting their precise location data to try to comply with international laws could itself trigger a violation if appropriate compliance measures (such as securing appropriate consent) aren’t taken, Harding says. Then there’s the question of what type of community is presenting what type of data. Medical, HR, and other sensitive data collection will trigger additional privacy compliance obligations.
Focus on current best practices
Ready or not, Gartner predicts that metaverses will have a profound impact on employee experiences by 2030, covering employee-to-consumer transactions, learning, procurement, employee onboarding, collaboration activities, and virtual office spaces, to name a few. Some of these will be purpose-built “mini-verses” while others will involve large-scale shared platforms. Platform providers including Meta, Microsoft, Apple, Sony, Amazon AWS, Google, NVIDIA Omniverse, and Epic Games are currently pumping billions of dollars into platforms and headsets to dominate this new market.
To protect users and data in this emerging virtual frontier, Globant’s technical director, Pablo Lecea, suggests focusing on best practices already used today. Globant has been helping businesses create metaverse experiences for 15 years, utilizing threat modeling, secure development, encryption, authentication, verification, secure data collection, and storage policies that align with current laws. Among its many engineering services, it also provides cybersecurity services for its clients.
For CISO resources, Lecea points to the Future of Privacy Forum, which advocates for stronger policy and controls to protect sensory, audio, and biometric information derived from VR devices. “According to the Future of Privacy Forum, a twenty-minute virtual reality session could collect over two million unique data points per user, while a traditional social media session collects fifty-five-thousand data points per user,” he notes. “This data must be protected, so having a security framework for developing these applications is critical.”
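To put those quoted figures in perspective, a quick back-of-the-envelope comparison (a sketch; the session length behind the social media figure isn’t stated, so only totals and a per-minute VR rate are computed):

```python
# Figures quoted from the Future of Privacy Forum above:
vr_points = 2_000_000      # data points collected in a 20-minute VR session
vr_minutes = 20
social_points = 55_000     # data points in a traditional social media session

# Per-minute collection rate for VR
vr_rate_per_minute = vr_points / vr_minutes

# How many times more data a VR session yields than a social media session
ratio = vr_points / social_points

print(f"VR collects ~{vr_rate_per_minute:,.0f} data points per minute")
print(f"Roughly {ratio:.0f}x the data of a social media session")
```

That works out to about 100,000 data points per minute of VR use, roughly 36 times the volume of an entire social media session, which is the scale of exposure a CISO’s security framework would need to account for.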