7 Cups teen safety: What you need to know

On July 16, 2018, a 14-year-old Texas girl sent explicit photos of herself to a 42-year-old man named Anthony Joseph Smith. 

Smith, who lived in Butler, Pennsylvania, met the teen online, posing as a 15-year-old boy, and they began messaging frequently. Eventually, he tried to convince the teen to leave her parents and join him in Pennsylvania.

It’s an increasingly familiar story. Online enticement and exploitation can happen on nearly any digital or social media platform. But Smith didn’t meet his victim on X/Twitter, Instagram, or Discord, platforms where well-known, documented cases of enticement, abuse, and exploitation have occurred.

Instead, Smith met the teen on 7 Cups, a popular, free emotional support platform that encourages people to chat with someone online about their problems. Some users are grappling with serious mental health issues.


This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it’s so hard to stop online child exploitation, and looks at solutions to make platforms safer.


The Texas teen, whose name wasn’t released by Pennsylvania authorities because she was a minor at the time, may have thought she was safe on 7 Cups. Teens as young as 13 can join its dedicated teen community. The company permits adults who’ve been internally vetted to chat with its teen members (Smith was not a vetted adult). Though 7 Cups recommends that minors receive parental permission before joining, it does not verify that, nor does it verify their age and identity.

As Smith proved, adults can lie about their age to gain access to the community. This remains true today; Mashable attempted to make teen accounts using a fake email address, name, and birth date, and was granted instant access. 

When told that Mashable had easily made a fake account to join the teen community, 7 Cups CEO and founder Glen Moriarty said doing so was against the platform’s terms of service. He noted that people can sign up for services online using inaccurate information and that 7 Cups employed certain measures, like blocking, reporting, and language detection tools, to help keep users safe. 

Moriarty said he was not informed by law enforcement or the minor’s parents about the case in Pennsylvania. He disputed that adults prey on youth on 7 Cups and that adult users experience persistent safety issues on the platform.

“[W]e have a thriving community of people,” he said in a written response. “If 7 Cups tolerated this behavior, we would not have a thriving community.”

While 7 Cups warns members against going off-site together, it still happens, according to multiple sources with high-level current or past involvement with the platform. 7 Cups does attempt to block personal information, like an email address, when people try to share it while chatting.

Regardless, Smith eventually lured the teen off-site to other social media and messaging platforms, though he was not successful in his attempts to get her to join him in Pennsylvania. 

“The reality that a young person might go online and seek confidence and support because they don’t have it offline, and that relationship being one that is abusive because there is a bad person out there that is targeting kids … that’s terrifying,” said Melissa Stroebel, vice president of research and insights at Thorn, a nonprofit organization that builds technology to defend children from sexual abuse. 

Emotional support platforms and their inherent risks to minors

Founded in 2013, 7 Cups was one of the first online emotional support platforms. These platforms are typically designed to be spaces where people can anonymously message a “listener” about their worries, stresses, and challenges.

The isolation of the COVID-19 pandemic, as well as the loneliness epidemic, supercharged the concept of digital peer support as a business model. Competitors to 7 Cups like Wisdo Health, Circles, and HearMe argue that their services are a critical tool given the nationwide shortage of mental health professionals and difficulty finding affordable therapy.

Venture capital firms and investors see promise in the model. In the past few years, they’ve poured more than $40 million into the largely unregulated field of startups, according to news reports and funding announcements made by those companies. 

In 2013, Moriarty successfully pitched the idea for 7 Cups to the famous Silicon Valley startup incubator Y Combinator, which he said still owns 7 percent of the company. Moriarty is also the longtime CEO of the digital learning company Edvance360.

Last year, the Office of the U.S. Surgeon General included Wisdo Health in a list of resources for improving social connection, a clear sign that power brokers take the model seriously. 

But an investigation into 7 Cups, and the emerging market of emotional support platforms, suggests that there are far more risks than the industry and its supporters disclose. These risks have been alleged online by concerned, often anonymous, users, but this reporting comprises the most comprehensive account of 7 Cups available to the public.

  • Mashable interviewed six sources with current and past in-depth knowledge of 7 Cups’ practices and safety protocols; reviewed the platform’s policies; spoke with listeners and users on other platforms; and discussed safety concerns with CEOs of other emotional support platforms. Mashable also investigated why 7 Cups lost a lucrative contract with a California state agency in 2019, and found that safety issues were a factor.

  • The sources who spoke about their experiences with 7 Cups requested anonymity because they feared violating a nondisclosure agreement the company required them to sign.  

  • Mashable found that several high-level current and former 7 Cups staff have long been concerned about the safety of minors and adults on 7 Cups. 

  • Though the platform employs strategies to keep bad actors and predators at bay, some have found ways to evade security measures. Moriarty told Mashable, “Combating people with bad intentions is an arms race. You have to continuously innovate to stay ahead of them.”

  • 7 Cups relies on volunteers to perform critical functions, such as moderating chat rooms and facilitating group support sessions, and teens are permitted to volunteer to work on company projects.

  • Volunteer listeners, who receive some mandatory training, are sometimes exposed to unwanted sexual content as well as offensive or bullying messages. The same behavior sometimes surfaces in public forums; users, for example, have been told to kill themselves by bullies or trolls. In both scenarios, 7 Cups attempts to block such speech before another user reads it by using language detection.  

  • Since platforms like 7 Cups use a peer-to-peer approach, they are not necessarily subject to regulation by the U.S. Food and Drug Administration or enforcement by the Federal Trade Commission. Nor are they required to comply with the Health Insurance Portability and Accountability Act for those services. 

While these risks are prominent on 7 Cups, Mashable’s reporting found that the industry has not openly addressed or resolved many of the same concerns.

“It makes you think there really need to be official systems of checks and balances when you have this degree of harm happening to people,” said Dr. John Torous, a psychiatrist and director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston.

The Texas teen’s parents discovered her exchanges and alerted law enforcement, who confirmed that Smith had asked for sexually explicit images and received four. Smith’s arrest was first reported by the Pittsburgh Tribune-Review in October 2018. Mashable reviewed Smith’s publicly available court records and confirmed the case’s details with Robert M. Zanella, Jr., the Butler County assistant district attorney who prosecuted Smith. 

In April 2019, Smith pleaded guilty to one count of corrupting a minor and four counts of coercing a child into creating child sex abuse material. He returned to jail last year after violating his parole by sharing fantasies about an adult woman’s young daughter on Facebook Messenger, according to Zanella. The woman reported those exchanges to the Federal Bureau of Investigation.

Smith’s case might be characterized by some as one more instance of a predator weaponizing digital technology to suit their own nefarious aims. But emotional support startups are distinct from other types of technology companies, like gaming and social media platforms, because they specifically invite vulnerable people to seek support from strangers, who may have a range of motivations and intentions. Smith’s crimes reveal how unpredictably risky these interactions can be. 

7 Cups of Tea: Talking to people online for free

The idea for 7 Cups of Tea, as it was originally called, started at psychologist and founder Glen Moriarty’s kitchen table, according to 7 Cups for the Searching Soul, a self-published book he co-authored in 2016.   

Moriarty turned to his wife, whom he has described as a therapist, for guidance with a business problem and was grateful for her “close listening.” The exchange was a revelation for Moriarty.

“Her care helped me see the problem in a different light so that I could solve it. It was at this point that the clouds parted, the sun shone through, and I had the insight I had been waiting on,” he wrote in 7 Cups for the Searching Soul. “What if, any time you needed it, you could access a person who would listen to you and care about your problem?”

Moriarty was the first listener on the platform, his wife the second. From the beginning, he struggled to find people to provide the service he was advertising. “I could never get enough listeners,” Moriarty told Twitch cofounder Justin Kan in a 2020 podcast interview.

The company has always relied on volunteers to operate.

“We deliberately designed the platform with a volunteer emphasis from the very beginning because that appears to be one of the only ways to scale emotional support,” Moriarty told Mashable.  

7 Cups relies on unpaid volunteers with little training to fill critical roles

Volunteer listeners on 7 Cups are not held to independent, standardized guidelines, like the National Practice Guidelines for Peer Supporters, though they are required to complete an “active listening” course upon volunteering to listen. They can take additional courses produced by 7 Cups, as well as consult with volunteer mentors identified by staff as having demonstrated strong leadership skills. 

Moriarty described the company’s staff as “incredibly lean.” Among the platform’s listeners, 1,500 have what Moriarty describes as “leadership roles.” This means that they take chats from members seeking support as well as volunteer their time on tasks like providing guidance to other listeners, sometimes helping them to process difficult chats, and monitoring forum posts for content that needs to be reviewed by staff. 

Sources who’ve worked and volunteered for 7 Cups said that dozens of volunteers lead major projects and perform key tasks, including evaluating user safety reports and complaints that are generated by automated safety tools. There is no publicly designated head of trust and safety known to the platform’s users. Moriarty told Mashable that “trust and safety is not something we have one person do, but is rather distributed across the team.”

Sources familiar with the recruitment of volunteers and the daily tasks involved in unpaid roles say there is little required training but high expectations. 

“You get no money, you get no protection, you get nothing,” said one former longtime volunteer, who requested anonymity to discuss their experiences. “They make it pretty clear that they want as much from you as possible, as long as possible.” 

Those who’ve volunteered for the platform told Mashable they believe in its stated purpose and have derived great satisfaction from extending compassion to someone in need. Moriarty said notes from users, including comments posted in forums, emphasize how much the service has helped them, and even “saved” them.

For the new 7 Cups user, the promise of healing connection is powerful. But the reality of what happens on the platform is far more complicated.  

Anonymity can compromise teen safety on 7 Cups

Moriarty has championed anonymity as a tool for building trust between users, and this is a common practice on competing emotional support platforms. Ideally, anonymous personas enable people to freely support one another without worrying that the information shared could be used against them publicly. Unless users share their real identity, no one really knows to whom they’re talking.

But anonymity can backfire, too. On 7 Cups, the failure to verify teens’ identities is what allowed Smith to go undetected as an adult predator. 

Several of the sources who spoke to Mashable said they were frustrated and distressed over the platform’s teen safety issues. Two sources with listening, volunteer, and work experience at the company showed Mashable screenshots of exchanges between the platform’s users in an effort to substantiate claims that adult listeners had preyed on teen members, and that teens were aware of and concerned about such behavior. Because the platform is anonymous, Mashable couldn’t verify the details of these accounts firsthand with alleged victims.

Four other sources with similar knowledge of 7 Cups said they’d known about concerns related to teen safety. 

Moriarty described the claim of concern over predatory behavior toward teens as “inaccurate.” He said the company has received and complied with only 10 law enforcement requests since its founding, and argued that the number was low compared to other social platforms.

Experts in online child exploitation, however, say that the number of cases investigated by law enforcement may be dwarfed by the actual incidence of predatory behavior, partly because minors may not feel comfortable reporting it.   

Additionally, some predators online seek out emotionally vulnerable minors who they believe they can manipulate into creating child sexual abuse material or other types of traumatic content. An FBI warning issued in September 2023 identified one such group of predators, which is known to target youth between the ages of 8 and 17 who struggle with mental health issues. There is no evidence that the group has infiltrated 7 Cups’ teen community.

Compared to its competitors, 7 Cups is unique in how aggressively it welcomes minors. In a 2018 presentation to California mental health officials, Moriarty said 18- to 25-year-olds were the platform’s largest demographic, followed by younger teens. 

Teens must be 13 to join as a member and 15 to volunteer as a listener. When teens seek to chat with a listener, they are either randomly paired with someone, with no ability to choose between a teen or an adult-teen listener, or they can browse the listener directory and request a specific listener. Listener profiles indicate whether the listener chats only with teens, or with both teens and adults; the latter means they are an adult who has been vetted by 7 Cups.

For teens who make a general request, not a personal one via the directory, and are paired with an adult-teen listener, a label following that listener’s username should indicate the person is an adult, Moriarty said. When Mashable tested the teen chat function, that information was missing for the adult-teen listener, which Moriarty said was a bug that would be quickly fixed. A teen can also determine whether their listener is an adult by hovering over their icon or by clicking out of the chat — which they can then return to — to view the listener’s bio page, which may or may not include a specific age.

Upon turning 18, minors can join or age into the adult side of the platform, though some sign up for it anyway before that milestone by creating an adult account with a false birth date, according to those with knowledge of related incidents. 

“In some ways, the easiest thing in the world for 7 Cups to have done at any point would’ve been just to say, ‘Let’s not do teens,'” said one source who previously worked at the company and who noted that efforts to connect teens to meaningful emotional support were genuine. 

“Clearly if a 42-year-old can pose as a 15-year-old, you’re not vetting the identities of the teens well enough,” the individual said. 

Research conducted by Thorn indicates that anonymity can contribute to increased risk-taking. An anonymous persona may embolden youth to interact with others online in ways they wouldn’t offline.

For predators hoping to abuse adolescents and teens, that can create opportunities to isolate, victimize, and “build false relationships” with young users, according to a 2022 Thorn report on online grooming, which surveyed 1,200 children and teens between the ages of 9 and 17. 

One in seven respondents said they’ve told a virtual contact something they’d never shared with anyone before, a disclosure that is all the more likely on an emotional support platform like 7 Cups, which invites youth to be vulnerable with strangers.

“Sadly, bad actors target this same information to groom, exploit, and extort minors,” the Thorn report noted.

Recently, a member of 7 Cups’ teen community asked leadership to raise awareness of predatory behavior on the platform and explain what to do when users encounter it, a sentiment that was echoed in a group support chat room. Moriarty said a community manager made a referral to 7 Cups’ safety information and bi-weekly safety web discussions.

The widespread use of volunteers on 7 Cups has also presented distinct safety challenges for teens.

Some 7 Cups sources said they heard directly from teen volunteers that they felt unsafe while communicating with adult volunteers, which Moriarty said he had no way to substantiate. They noted that while users are instructed not to go off-site together under any circumstances, volunteers correspond via Google Chat and Meet without dedicated oversight by paid staff. Moriarty confirmed to Mashable that volunteer leaders may use Google communication tools to “collaborate” with other volunteer leaders. 

Based on past incidents, current and past staff and volunteers remain concerned that teens may be targeted for exploitation or grooming in those circumstances. 

Safety protocols don’t go far enough

In general, Moriarty said 7 Cups has safety protocols designed to keep anonymous bad actors and predators from contacting minors, but multiple past and current staff members and volunteers told Mashable that they fear those practices aren’t robust enough.

The platform has 87 adult-teen listeners, most of whom are on staff or are high-level volunteers. Only 12 of those listeners have no other affiliation with 7 Cups.

In order to gain access to the teen community as an adult without lying about age, listeners need to have extensive experience on the platform, good reviews, and what 7 Cups refers to as a background check. 

That process involves submitting state-issued identification to the company, as well as completing a video conversation with a platform moderator. Additionally, 7 Cups staff search the web for press coverage linking the applicant’s name to criminal acts, such as sexual assault, and may check whether their name appears in a national database of sex offenders.

Moriarty said that all applicants must pass a background check by companies that specialize in such research, but those familiar with the process say that hasn’t always been the case. Instead, they said that the company previously used free resources like Google and social media to check applicants’ personal information.  

Currently, 7 Cups doesn’t have a rigorous standard, such as algorithmic assessment technology, for verifying that identification is genuine rather than doctored or forged. Moriarty said the company is exploring the use of sophisticated identity document verification.

Nor does the company have clear directives for handling complaints about potentially criminal behavior toward minors on the platform, aside from instructing staff and users to report concerns through its safety reporting form. A pinned message at the top of each chat instructs users who feel unsafe to visit the platform’s “safety & reporting” page, which recommends using blocking, reporting, and muting tools. A brief section on teen safety urges minors to talk to a parent or guardian if they feel unsafe.

One source with knowledge of the platform’s current practices told Mashable that there wasn’t widespread staff training on whether and how to escalate such reports to law enforcement. When Mashable asked whether 7 Cups informs a minor’s parents when an adult has tried to contact their child, Moriarty called it a good idea and said the platform would be implementing the protocol shortly.

A former high-level 7 Cups volunteer, who also served as an adult-teen listener, said that multiple teen members of the platform approached them with questions about how to deal with uncomfortable interactions with adult listeners. Often, the teen felt something was amiss with the adult’s behavior, but they struggled to pinpoint a specific red flag or offense.  

“When you have somebody that you think is empathizing with you and listening to you and finally getting you…you’re forming this intense bond and then they say things like, who knows what, you don’t want to disappoint them, or break that bond, or lose that relationship, and then somebody pounces,” the former volunteer told Mashable. 

Until Mashable contacted Moriarty for comment, the platform hadn’t updated its webpage on teen safety since May 2019. He said the company was also reviewing where and how it presented information about reporting unwanted or abusive behavior to make those instructions more accessible.

Safety practices vary widely from platform to platform on the web, said Lauren Coffren, an executive director of the Exploited Children Division at the National Center for Missing & Exploited Children. That makes it hard for minors, and their caregivers, to understand which policies keep them safest. It may also be an advantage for predators. 

“People who want to be able to exploit those differences or exploit [that] lapse of reporting mechanisms or safety features or tools, they’ll certainly be able to find a way,” Coffren added. 

What happens when someone is harmed on 7 Cups

Simply put, there are no dedicated federal agencies that regulate platforms like 7 Cups.

The company’s emotional support product falls in a gray regulatory area. And while Moriarty described the platform’s peer-based interventions as “medicine” in his interview with Justin Kan, these interactions are not offered by licensed clinicians, nor held to rigorous independent testing or standards. 

Neither the Food and Drug Administration nor the Federal Trade Commission would comment specifically on 7 Cups itself. Instead, both agencies pointed Mashable to their regulatory guidelines. The FDA may regulate mobile apps whose software is intended to treat a condition, but that doesn’t apply to emotional support. The FTC could potentially enforce laws related to health claims and marketing practices, if they were allegedly deceptive.

This may leave consumers wondering to whom they can turn if they, or their child, has been harmed on the platform.

Until recently, the law didn’t offer much hope, either. Traditionally, 7 Cups might have been considered immune from liability for harm inflicted on its users when they encountered a bad actor on the platform. In the past, courts typically dismissed such lawsuits, citing a federal law known as Section 230 of the Communications Decency Act, passed in 1996.

The law provides that online platforms cannot be held liable for the negative things that their customers or users do just because those things occur on the platform. There are some exceptions, including copyright infringement, criminal actions, child abuse, and sex trafficking. Section 230 protection hinges on whether the company is being sued solely in its role as a publisher of other people’s content. Some courts have interpreted this broadly, giving platforms immunity from liability when the company’s customers experience harm based on the platform’s content.

But Section 230, as tech companies have come to know and rely on it for nearly 30 years, may be changing. In a Senate hearing on online child sexual exploitation in January, which featured top tech company executives, key senators called for the law’s reform.

Courts have also allowed recent lawsuits against certain platforms to move forward, rejecting some of the platforms’ claims to immunity under Section 230.

One key case is a nationwide lawsuit against major social media companies, including YouTube, TikTok, and Instagram, filed on behalf of young users who were allegedly harmed as a result of using the platforms. Last November, a judge ruled that critical aspects of the suit could move forward, despite the companies’ insistence that they were protected by Section 230.

Instead, the judge found that the plaintiffs had alleged the platforms’ product design choices led to harm that had nothing to do with the content that users published. For example, the judge ruled that the failure to implement robust verification processes to determine a user’s age, effective parental controls and notifications, and “opt in” protective limits on the duration and frequency of use are product design defects for which the companies could potentially be held responsible.

Jennifer Scullion of Seeger Weiss, a firm representing the plaintiffs, told Mashable in an email that all companies “have a responsibility to run their businesses in a way that avoids foreseeable harm.”

Scullion said that while emotional support platforms involve a different set of facts and analysis than the case against major social media companies, “the real dividing line is whether the harm is from the content itself or from choices a company makes about how to design their platform and what warnings they give of reasonably foreseeable or known risks of using the platform.”

The lawsuit that forced the chat platform Omegle to shut down last year may also hold lessons for 7 Cups. In that case, attorney Carrie Goldberg sued the company on behalf of a teenage girl who, at age 11, had been paired by Omegle to chat with a child sexual abuse predator. He spent the next three years exploiting her, forcing her to make child sexual abuse material for him and others.

That case also moved forward despite Omegle’s attempts to shield itself from liability by citing Section 230. The judge found Omegle could be held responsible for defective and negligent product design. Omegle settled the suit. 

Goldberg, who hadn’t heard of 7 Cups prior to speaking with Mashable, said attempting to sue the company for harm experienced by users, particularly those who are minors, would depend on whether their distress was caused by content published by other users on the platform or by the design of the product itself. 

Goldberg expressed concern about 7 Cups’ ability to match vulnerable people, including children, with bad actors, noting that such information could easily be used to manipulate or exploit them.

“It’s a product that’s grooming people to be revealing very intimate details of their life,” she said.

If you are a child being sexually exploited online, if you know a child who is being sexually exploited online, or if you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.
