The app economy, both at home and abroad, is one of the most remarkable and transformative developments of recent years, and one that people largely take for granted. Just think: a few years ago it would have been inconceivable that you could turn on your phone, activate the GPS, and have a car driven by a complete stranger take you to your destination. But that's Uber.

What about all of those amazing vacation photos that used to sit idle in boxes until relatives came over? Now you can share them on Instagram with the tap of a finger for all the world to see. What about that snarky joke that occurred to you on your drive into work? You can share it on Twitter for all of your "fans" to read and pass along. You can surf the web, trade stocks, manage your money, track your fitness and more, all through apps on your smartphone. Apps are so ubiquitous that we almost take them as a given.

The scale of the app economy in the United States is huge: in 2018 alone, it generated $120 billion in gross annual revenue for app stores, according to the analytics firm App Annie, and Deloitte counted more than 317,000 companies active in mobile app development last year. Clearly, apps are very big business.

The problem is that while there are untold numbers of honest and aboveboard app developers, there are at least as many, and probably far more, trying to develop a different type of "killer app": one that could compromise your phone or device, steal your information, spam you, or use your device as a way into more important and secure networks.

Both the Google Play Store and Apple's App Store have rules and vetting processes that apps must pass before they go live. This is, however, a losing battle. Nearly every day, apps are approved that circumvent the rules, change their behavior after they are uploaded, or poach intellectual property directly or indirectly. And a number of third-party app stores and sites have vastly looser rules, if any at all, for screening apps before they are listed.

Other apps are simply malicious vehicles for getting into your device and stealing your information. Upon installation, such an app will ask for permission to access nearly every part of your phone, including text messaging, contacts, email and the camera. If you aren't savvy, you'll grant the criminals free access to the most personal of devices.
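For the technically inclined, the short Kotlin sketch below shows what such an over-broad request can look like on Android. The "flashlight" app and class name are invented for illustration, but the permission constants are the real ones the platform defines, and one tap on "Allow" for each is all it takes to hand over that access.

```kotlin
import android.Manifest
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat

// Hypothetical "flashlight" app that asks for far more access than it needs.
// Each constant below is a genuine Android permission; together they expose
// your messages, contacts, camera and precise location.
class FlashlightActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        ActivityCompat.requestPermissions(
            this,
            arrayOf(
                Manifest.permission.READ_SMS,
                Manifest.permission.READ_CONTACTS,
                Manifest.permission.CAMERA,
                Manifest.permission.ACCESS_FINE_LOCATION
            ),
            0 // arbitrary request code identifying this permission prompt
        )
    }
}
```

Nothing about turning on a flashlight requires reading your text messages, yet the operating system will dutifully present every one of those requests to you.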

So what are we to do? We need to build out a digital privacy and security ecosystem in which products and apps are thoroughly and independently tested and vetted before being fielded. What might this look like? We see a highly effective system in place today for electrical products: Underwriters Laboratories vets and certifies massive numbers of products as having met its standards. We need to develop a similar approval label system for apps.

Apple's App Store already requires some security review, for example, but it could be expanded to include a privacy and cybersecurity vulnerability report card. If an app doesn't meet the highest standards, it shouldn't be sold in the App Store or the Google Play Store.

Privacy and security shouldn't be an afterthought; they should be at the core of app design. Independent vetting and report cards of the kind described above are part of the answer. We also need to change the mindset of app developers. Long term, we need to bake digital privacy into the curriculum at every level of education, so that the students who go on to develop new apps, in school and beyond, carry that mindset with them.

Unless we get a handle on this situation soon, the next killer app could well be the app that kills your privacy.