Like many new parents last year, I walked a tightrope trying to keep my young child healthy as well as happy. As my daughter moved out of infancy and became a much more aware toddler, I decided it was time to send her to preschool. It beat having her stare at the same four walls of the living room while I pondered the health risks over and over again. After a few web searches and a few phone calls, I chose one that was close by and had open seats (which were hard to come by). When I started the enrollment process, a flyer buried in a huge packet immediately threw me into a new set of worries I didn’t want to deal with: “We also use the Brightwheel mobile app to record attendance, share milestones, and keep parents informed about daily interactions.”
I don’t know what other parents think at this point, but privacy and security are my day job at the Electronic Frontier Foundation, so I couldn’t resist looking at the security controls Brightwheel gave me as a parent. This was, after all, my child’s data in a company’s hands. Don’t get me wrong: the app provided real convenience, letting me see my daughter smile, make friends, and ride a bike while playing outdoors, especially during that first week when you’re not there to watch every moment of her life for the first time. But when I looked into my account, I found very few settings that said anything about security. There was a PIN to verify check-in and check-out, but that was it.
For several months, I watched the enormous amount of data transferred and stored daily in this application: diaper changes, photos during storytime, bedtimes, and more. The more data I saw about my daughter, the more my anxiety grew.
By October 2021, I couldn’t sit on it anymore. I wouldn’t call myself a hacker by most people’s definition. But in this case, for my daughter’s sake, being a mother meant doing everything in my power to keep her safe. So I began a months-long dive into the world of early education apps, and I didn’t like what I found.
I’m lucky in where I work. A few cold emails and some back-and-forth later, a colleague (also a new parent who had been asked to use Brightwheel) and I finally met a real person at the company. The meeting was productive in the sense that Brightwheel seemed to understand the concerns, but it also drove home just how far behind the entire industry is on privacy and security.
For example, one very simple and well-known security measure is two-factor authentication. You know how some services now ask you to enter a one-time code in addition to your password? That’s two-factor authentication, and it delivers a huge security return for very little effort. It’s spreading fast, and it’s practically the industry standard these days.
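Those one-time codes aren’t magic, either: most authenticator apps implement a published standard, TOTP (RFC 6238, built on the HOTP construction from RFC 4226), and the whole mechanism fits in a few lines. Here is a minimal sketch using only Python’s standard library; the shared secret would normally be handed to your phone once via a QR code.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password for a given counter value."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, timestamp=None, step: int = 30) -> str:
    """RFC 6238 time-based OTP: HOTP where the counter is a 30-second clock."""
    if timestamp is None:
        timestamp = time.time()
    return hotp(secret, int(timestamp // step))
```

Because the service and your phone share the secret, each side can independently compute the same six-digit code from the current time, and an attacker with only your password is locked out.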
Brightwheel now makes two-factor authentication available to all school and daycare administrators and to parents, but it is the only company in this space that has done so. That’s absurd.
Some of these companies do not disclose what data they collect or where it goes. And we found that in some cases they track and share information much the way Facebook does. That’s bad enough when it involves adults on a public social network; it’s terrible when it involves a preschooler.
Researching the privacy and security of an app your child’s daycare uses is not like researching how to get your child to sleep or which high chair to buy, where parents can easily find reliable sources of information. This information simply isn’t publicly available. Parents and administrators are sold convenience, but they aren’t given even the most basic tools to choose a safe app.
And those of us with the know-how to find and fix these vulnerabilities face another problem: companies that don’t want to hear about them. As an ethical hacker, I planned to disclose what I found and wait 90 days for a response (common practice in the security industry). Even there, I ran into roadblocks.
Beyond finding no way to contact some of these companies through their websites, I learned that researchers in Germany had released a paper in March 2022 identifying security and privacy concerns in 42 preschool and daycare management apps. In addition to describing the vulnerabilities, the paper explains that the researchers followed responsible disclosure practices and received little to no response from the companies.
This is unacceptable. If your company handles sensitive information and researchers are doing free work to make your product safer, ignoring them is a terrible practice.
I published my own study of these applications on the EFF website, where you can dig into the technical details. The main takeaway: these services are not as secure as they could, or should, be.
Some very basic requirements that we have for all of these companies are:
- Make two-factor authentication available to all administrators and staff.
- Fix known security vulnerabilities in mobile apps.
- Disclose and list all tracking and analytics tools and how they are used.
- Use secure cloud server images, and put a process in place to continually update legacy technologies on those servers.
- Lock down any public cloud storage that hosts children’s videos and photos. That data should not be public; the daycare and parents should be the only ones able to access it.
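On the storage point, this isn’t a hard fix. As one illustration (assuming the photos sit in an Amazon S3 bucket; the bucket name here is hypothetical, and other clouds offer equivalent controls), a single “Block Public Access” configuration enforces the rule account-wide:

```python
import json

# S3 "Block Public Access" settings: with all four flags on, no object ACL or
# bucket policy can expose children's photos and videos to the public internet.
public_access_block = {
    "BlockPublicAcls": True,        # reject any new public ACLs
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject bucket policies granting public access
    "RestrictPublicBuckets": True,  # cut off public and cross-account access
}

# With the boto3 SDK installed and AWS credentials configured, one call applies it:
# import boto3
# boto3.client("s3").put_public_access_block(
#     Bucket="example-daycare-media",                # hypothetical bucket name
#     PublicAccessBlockConfiguration=public_access_block,
# )

print(json.dumps(public_access_block, indent=2))
```

Access for teachers and parents then flows through the app’s own authenticated requests rather than through publicly guessable links.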
In addition, we would like these apps to set the standard for protecting communications sent between schools and parents. End-to-end encryption is well suited to this; the server doesn’t need to see updates about a child’s life.
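The idea behind end-to-end encryption is simple: the school and the parent agree on a secret that the relaying server never learns. Here is a toy sketch of that key-agreement step using only the standard library. The small Diffie-Hellman modulus and hash-derived keystream are for illustration only and are not secure; a real app should use a vetted protocol such as the Signal protocol.

```python
import hashlib
import secrets

P = 0xFFFFFFFFFFFFFFC5  # toy 64-bit prime modulus (far too small for real use)
G = 5                   # toy base for the Diffie-Hellman exchange

def keypair():
    """Generate a private exponent and the public value derived from it."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(my_priv: int, their_pub: int) -> bytes:
    # Both endpoints compute the same secret; a server relaying only the
    # public values cannot.
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy hash-derived keystream; encrypting and decrypting are the same op."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The school and a parent each generate a keypair and exchange ONLY public keys.
school_priv, school_pub = keypair()
parent_priv, parent_pub = keypair()

update = b"Nap time 12:30-1:45; ate all of her lunch"
ciphertext = xor_cipher(shared_key(school_priv, parent_pub), update)
plaintext = xor_cipher(shared_key(parent_priv, school_pub), ciphertext)
assert plaintext == update  # only the two endpoints can read the update
```

The server still stores and forwards the ciphertext, so the convenience of daily updates survives; what disappears is the company’s ability to read, mine, or leak them.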
Finally, these companies need to monitor reports of problems with their applications and respond to them proactively. It shouldn’t take a technologist at a digital privacy organization, plus a colleague who happens to be a lawyer working on the same issues, sending cold emails and working contacts just to get a meeting.
Being able to receive daily updates on how your child is doing at daycare is extremely comforting for parents. It was for me. Unfortunately, that comfort was soon outweighed by the dangers I discovered.
Credit: www.wired.com