By Oscar Lewis on Wednesday, 16 August 2023
Category: Politics

How Dating Apps Became a Paradise for Predators

Matthew Herrick’s dating profiles depicted a muscular man with olive-green eyes and red-hot bedroom interests. The aspiring actor liked orgies and bondage. He was HIV-positive but preferred unprotected sex. He had a “rape fantasy,” and if he seemed resistant to the advances of would-be partners, it was simply part of what got him off.

Between October 2016 and March 2017, more than 1,100 men accepted invitations to meet Herrick in person. They showed up at his West Harlem apartment in droves, and on several occasions broke into the building. They followed Herrick into public restrooms and approached him at the midtown Manhattan restaurant where he worked as a server, because that’s what they had been instructed to do.

But Herrick had not matched with any of these men online. His dating profiles—and practically everything in them—were bogus. His vengeful ex-boyfriend, Oscar Juan Carlos Gutierrez (whom Herrick had met on a dating app), had created the accounts. He was able to impersonate Herrick because, court documents show, the apps didn’t require him to take any steps to verify his identity.

Herrick eventually got an order of protection against Gutierrez, but the torrent of uninvited houseguests continued. It took almost a year for Gutierrez to be arrested on criminal charges. He was sentenced to prison for criminal contempt, identity theft, and stalking in November 2019.

By that time, the nightmarish experience had taken a severe toll on Herrick’s mental health. “I had extreme insomnia. I was scared to leave my apartment,” Herrick, now 38, recalls. “I was drinking heavily to try to get some sleep,” he adds. “There was a moment where I thought, ‘Do I take my own life, or do I continue fighting? Because I can’t do this anymore.’”

The ex-boyfriend didn’t act alone in making Herrick’s life a living hell. Gutierrez weaponized Grindr and other dating apps to carry out his abuse. According to court records, Grindr ignored roughly 100 pleas from Herrick and his loved ones to remove the fraudulent profiles and block Gutierrez from making new ones. (The company disputes it ignored Herrick’s complaints and claims it worked with the police on his case. Two other gay dating apps took swift action to remove fake profiles, Herrick says.)

In 2017, as Herrick pursued criminal charges against Gutierrez, he also sued Grindr. He wanted the company to disable the bogus profiles, but he also sought to hold Grindr liable for both facilitating and failing to stop the harassment against him. But his lawsuit soon hit a brick wall thanks to what Herrick’s lawyer Carrie Goldberg calls the tech industry’s “get-out-of-jail-free card”—Section 230 of the Communications Decency Act. “The whole thing is horrible,” noted a judge for a federal appeals court, which upheld a lower court’s ruling to toss Herrick’s case. “But the question is, what’s the responsibility of Grindr?”

None, the judge concluded. Since Section 230’s passage in 1996, courts have interpreted the key 26 words of the statute to offer broad legal immunity to internet platforms that host user-generated content, treating them differently than traditional publishers, which are on the hook for the material that they broadcast or print.

Section 230’s protections act as a foundation for most of the modern-day internet. TikTok invoked the statute in a legal filing claiming it is not responsible for a viral trend that encouraged users to choke themselves until they lost consciousness (some died). Snapchat has cited 230 to dismiss a lawsuit contending it is responsible for fentanyl deaths resulting from drug deals brokered on its platform. Omegle, a website that randomly pairs users for private video chats, argued in a 2022 court document that the statute immunized it from a lawsuit alleging it was liable for allowing an adult man to extract pornographic imagery from an 11-year-old girl over the span of three years. (Goldberg also represents the plaintiff in the Omegle case, which Mother Jones covered last November.)

Historically, lawsuits targeting tech companies over user-generated posts have been thrown out on 230 grounds before they even advanced to discovery. In this sense, “the judicial prong of our government almost doesn’t have the reach to include these companies,” Goldberg says.

Internet behemoths spend big money to favorably shape the regulatory landscape. Over the last four years, Twitter and the parent companies of Facebook and Google dished out about $100 million to lobby the federal government on internet policy, including on bills that would impact Section 230’s reach. Match Group—the parent company of Hinge, Tinder, Match.com, Plenty of Fish, and several other dating platforms—spent $1.45 million on lobbying in 2022 alone. It was a 13 percent increase from 2020, and the 10th-highest spending by any company advocating on internet issues.

Absent Section 230 reform, online platforms have little impetus to police themselves or verify users are who they say they are. And while many platforms carry the routine hazards of deception, fraud, and harassment, dating apps, which exist to facilitate in-person meetups and hookups between strangers, also come with the potential for sexual and physical abuse.

Herrick feels he was one of the lucky ones. His abuse, while soul-crushing, never resulted in serious physical harm. “There are people that don’t survive,” Herrick says. “People are murdered or raped or beaten…and nobody’s held accountable.”

Julie Valentine has treated hundreds of rape victims over her 17-year career as a forensic nurse. Starting around 2014, Valentine and her colleagues noticed something was changing about these patients. Increasingly, they mentioned that they had met their assailants on dating apps.

Valentine and a team of researchers began analyzing the forensic records of sexual assault victims in Utah between 2017 and 2020. “What we found was really profound,” says Valentine, an associate dean at Brigham Young University’s nursing school. “And pretty terrifying.”

Their research showed 8 percent of the victims had been assaulted during an initial meetup arranged through a dating app. Nor is that figure an outlier: A similar report from the UK found that roughly 1 in 10 victims of serious sexual crimes met their attackers on dating apps, and 47 percent of suspects reported to UK authorities for sexual offenses facilitated through online dating had previous convictions. Valentine believes the real percentage of survivors who connect with their assailants on dating apps could be even higher, since many sexual assaults are never reported. Her study was also limited by the type of information requested on hospital examination forms, which usually don’t identify dating app–linked rapes that occurred after the first date.

Despite the limits of the data, her findings were startling: Even though the major dating apps require users to be 18 and over, Valentine’s team found that many survivors were minors. Further, the injuries dating app victims sustained were more severe: 32 percent were strangled, versus 22 percent of non–dating app victims. Dating app victims were nearly twice as likely to have injuries to their breasts, and they had an 11 percent higher rate of injuries to their genitals. Sixty percent reported they struggled with mental illness. “Violent predators,” Valentine says, “use these dating apps as hunting grounds for vulnerable victims.”

Illustration by Petra Eriksson

On the afternoon of September 20, 2022, the mother of a Nova Scotia teen emailed Carrie Goldberg with a harrowing story of abuse: “Lawsuite [sic] against Grindr,” the subject line read. Fifteen years old, closeted, autistic—her son fit the definition of vulnerable. Over the span of four days in April 2019, the mother told Goldberg, he was sexually assaulted by four separate Grindr users, who ranged in age from 21 to at least 56. The boy, identified as John Doe in court records, was so traumatized by these encounters that he dropped out of high school, attempted suicide, and was admitted to a psychiatric facility.

Back in 2014, after an ex threatened to send revenge porn to her professional colleagues, Goldberg had started a boutique firm dedicated to representing targets of online abuse. She was an obvious choice for Doe’s case—few other attorneys would touch lawsuits that would inevitably involve 230. Even so, Goldberg hadn’t envisioned battling Grindr again anytime soon. She had been gutted by the outcome of Herrick’s case, which had been both “financially costly and lonely, because even my closest allies in online civil rights refused to support my product liability legal theory.”

In Herrick’s suit, she had attempted to circumvent 230’s immunity shield by approaching the case as a lawyer would the manufacturer of a malfunctioning child car seat or any other faulty product. She argued that the app was defectively designed because it was so easy to manipulate and had few safeguards to protect users such as Herrick. That argument did not sway the judges in Herrick’s case, but in the years afterward Goldberg felt it was gaining traction. One sign came in 2021 when the US Court of Appeals for the Ninth Circuit ruled that Section 230 did not shield Snapchat from a suit over a photo filter that measured speed, which three Wisconsin boys were using when their car collided with a tree at 113 miles per hour. The company could have taken “reasonable measures to design a product more useful than it was foreseeably dangerous,” the court said. A personal victory came in July 2022, when Goldberg persuaded a district judge that Section 230 did not immunize Omegle from the claim that its product design harmed a young girl.

Some experts remain dubious of the product liability strategy Goldberg has pursued. “It would be extraordinary for online dating apps to fix problems in social interactions that have existed since people have been social,” says Eric Goldman, an internet law scholar at Santa Clara University School of Law. “Any legal standard that requires services to achieve that is actually not possible, because the dating apps can’t fix humans.”

However, Leah Plunkett, a faculty associate with the Berkman Klein Center for Internet & Society at Harvard University, says society needs to “get past the false belief that everything a social media platform does is necessarily publishing or speaking,” and that the product liability angle is one component of a “real turning point, maybe even a tipping point in the ways in which social media platforms are going to be regulated.” She adds that cases involving minors are the first frontier of this changing landscape.

Goldberg responded within five minutes to the email from Doe’s mom, and ultimately took the case. She once again sued Grindr, arguing that the app facilitated her client’s connection with the four men who assaulted him. Though Grindr’s terms of service state that users must be 18 or older, the company only verifies the ages of users it suspects are underage, usually because other accounts have reported them. Grindr said it blocked or banned “tens of thousands of suspected underage accounts per year.” But this mechanism of self-policing is spotty at best. Grindr never asked Doe to prove his age, according to court records. And he’s not alone. A 2018 Northwestern University study found that half of the underage, sexually active gay and bisexual teens it surveyed were active on dating apps, including Grindr.

Goldberg contends that Grindr doesn’t just enable underage use—the company encourages it. Citing one promotional video featuring two young people wearing backpacks and seemingly using Grindr outside of a high school and another video showing a student in gym class, Goldberg’s suit alleges the company “markets the Grindr App towards children.”

Grindr rejects the allegation. “We have no reason to want underage users on our platform,” says Patrick Lenihan, the company’s head of global communications. But he also suggests that taking more robust measures, like ID verification for all users, would create privacy risks, especially for those who are not out as gay. Grindr, which operates in dozens of countries, also worries about storing data like IDs in the event governments hostile toward gay rights compel it to turn over user information. “Our users need two things,” Lenihan says. “They need privacy and they need safety. And those things are two sides of the same coin.”

But in this compromise, privacy has a clear edge. Because he wasn’t required to verify his identity, John Doe, a child, could access the app by punching in a phony birthdate. Grindr’s lax requirements may have also made it hard to identify one of Doe’s rapists, who used a fake name. While three of the assailants were convicted of sexual crimes against Doe, the fourth perpetrator “remains at large,” Goldberg’s suit says, “and a risk to the children of Nova Scotia.”

At its best, dating can be awkward and nerve-wracking. It can require making yourself emotionally and perhaps physically vulnerable to a person you are just getting to know. Apps have the potential to make dating easier and more fun, by helping people with common interests find each other. Apps could also, in theory, make dating safer, by letting users learn about their dates before even meeting.

But the opposite was true in my experience, even as a reporter trained to investigate things. When I connected with a guy on Match Group’s Hinge app during the summer of 2021, I did my typical due diligence, confirming his listed name and job were legit. Soon, we started dating, casually at first, before deciding to make things exclusive. Three months into our relationship, my roommate noticed that he had dropped his passport on our living room floor. When she looked inside, she was stunned to see a birthdate that made him six years older than his profile claimed, and 11 years my senior. The discrepancy was one factor in my decision to end the relationship. Months later, I was contacted by a woman who had found romantic texts the man had sent me while supposedly in an “exclusive” relationship with her. He had also told her he was younger than the age listed on his passport.

I joked to friends that the experience was an online dating rite of passage. But privately, I felt ashamed that I was so easily manipulated. Reluctantly, I returned to the apps months later and found his profile with a still-inaccurate age. I reported him to Hinge for violating the company’s terms of service, which prohibit users from misrepresenting themselves. Hinge never replied, and he was able to continue meeting women online: Someone later posted his profile photo to a local Facebook page dedicated to making dating safer for women in the Washington, DC, metro area, where I live. A warning accompanied the screenshot: He is a “serial cheater and pathological liar…. Do yourself a favor and run.”

That Facebook page is one of more than 100 similarly purposed internet sleuthing groups across the nation. Millions of users have joined to share profile pics and first names of potential dates. Often, the comments under a photo are rather innocuous: Perhaps a guy said he was 6 feet when he was really 5 feet 10, or a woman wouldn’t stop talking about her ex on a first date. These communities let people share such experiences, giving users the opportunity to decide for themselves whether the person is still worthy of their time. But they are also inherently flawed, allowing aggrieved people to post defamatory information tied to a person’s photo without that person having any opportunity to defend themselves.

But since dating apps aren’t usually confirming identities, let alone screening for criminal histories, the pages are one of the few ways people on the dating market can try, perhaps futilely, to protect themselves from truly dangerous people. Last year, for instance, the DC Facebook page featured screenshots of a dating profile belonging to a guy who had fatally stabbed a store manager more than 50 times in nearby Alexandria, Virginia, because he thought the manager was a werewolf. That man was found not guilty by reason of insanity in 2019 and spent three years in a court-ordered mental health facility. But upon his release, his dating app profile simply said he was “getting back from two years of travel.” (He was subsequently ordered by a court to stay off social networking sites other than LinkedIn.)

In 2020, when 21-year-old Laura encountered a criminal on dating apps, she took to social media to warn other women. A tall and adventurous 22-year-old named Christian had repeatedly matched with her and a friend on Bumble and Tinder in the San Francisco Bay Area. When the two women rejected Christian’s aggressive advances, which included pressing one of them for her address, he called Laura a “weak bitch” and her friend a “fucking flake.”

Suspicious, the friends researched his phone number and discovered that Christian was not his real name. Nor was he 22. Public records showed his phone number was tied to a 55-year-old registered sex offender. Laura says Bumble quickly responded to her pleas and took down his profile, but Tinder, owned by Match Group, did not reply when she reported him. Match Group didn’t answer my questions about Laura’s experience. The company has previously said that it’s hard to keep sex predators off its apps, admitting to ProPublica in 2019 that “there are definitely registered sex offenders on our free products.” (The company said later in a statement: “We do not tolerate sex offenders on our site.”)

A recent story out of Wisconsin had a similar beginning but a far worse ending. A police report I obtained alleges that last September, a woman in the suburbs of Milwaukee went on a date with a guy named Timothy L. Olson, whom she met on Match.com under the fake name “Tim Wilson.” The next day, she woke up confused. She couldn’t find her car and discovered $800 worth of fraudulent charges on her debit card. Records indicate the woman may have been drugged.

As police searched for Olson last fall, a 60-year-old psychology professor named Daun Kihslinger was found dead in a home belonging to Olson’s mother. Kihslinger’s family told a local CBS news affiliate they believe she “died from drugs given to her by a man she met on a dating app.” A spokesperson for the Racine, Wisconsin, police department says Olson remains a “person of interest” in Kihslinger’s death.


Later that month, 55-year-old Kim Mikulance, a mother of four, collapsed at a bar while sitting with Olson. He fled the scene, and she died a few days later in the hospital due to still-undetermined causes. In late June, South Milwaukee Police Chief William Jessup told Mother Jones Olson is still a person of interest in Mikulance’s death.

As of June, court records show, Olson was in police custody on charges related to the kidnapping of an elderly woman. But his rap sheet includes at least one other online-dating incident. In 2009, he was sentenced to more than five years in prison after stealing thousands of dollars from a woman he met in 2006 on Match.com, also under a fictitious name. “I just can’t believe this is going on 16 years later,” his victim told a local ABC news station in Milwaukee after Olson was arrested in late 2022. “He’s just going to keep getting out and keep doing this.”

While Olson is now under investigation for more serious crimes, he started out using online dating sites to steal from women. That’s incredibly common: In 2022, Americans reported losing a whopping $1.3 billion to romance scams, according to the Federal Trade Commission, and dating apps and websites were a frequent source. Between 2013 and mid-2018, “as many as 25-30 percent of Match.com members who registered each day were using Match.com to perpetrate scams,” the FTC alleged in a 2019 case against Match Group. (The company contended at the time that the agency “relied on cherry-picked data” and said it blocks “96% of bots and fake accounts from our site within a day.”)

Nevertheless, a federal judge ruled in March 2022 that the company was not responsible for users’ financial losses, no matter how many scammers were prowling its website. Citing Section 230, the judge wrote that Match Group was “acting as a publisher of third-party generated content” and was therefore “entitled to immunity.”

Illustration by Petra Eriksson

In 2017, author and cybersecurity law professor Jeff Kosseff asked Sen. Ron Wyden (D-Ore.), who co-wrote Section 230 when he was a member of the House of Representatives, to reflect on its legacy. “I always thought the bill was going to be useful,” Wyden said. “But I never thought its reach would be this dramatic.” (Wyden’s office declined to comment for this story.)

The 1996 statute was a reaction to divergent opinions in defamation lawsuits targeting early internet platforms. Five years before, a federal court had dismissed a case against CompuServe for hosting allegedly defamatory comments about a company called Cubby Inc. But in 1995, another court ruled in favor of brokerage firm Stratton Oakmont, which had sued the online service Prodigy over comments posted by an anonymous user alleging the financial firm was engaging in fraud. (In fact, Stratton Oakmont’s executives were later indicted for fraud. The company’s scandals inspired The Wolf of Wall Street.)

Then-Reps. Christopher Cox (R-Calif.) and Wyden responded by introducing Section 230, establishing that interactive platforms should not be treated like newspapers and other publishers, even if they engage in some good-faith content moderation. In the years since, the tech world has viewed defending Section 230 as an existential issue. Match Group, for example, argued in an amicus brief it submitted in Matthew Herrick’s case against Grindr that companies like theirs might “cease to exist altogether” without it.

Match Group made similar points in a brief it submitted in defense of Google, which was sued by the family of Nohemi Gonzalez, one of 130 people slain during a series of 2015 terrorist attacks in Paris. The Gonzalez family partially blamed the death of their 23-year-old daughter, who was studying abroad, on YouTube, alleging that the Google subsidiary indoctrinated terrorists by feeding them a stream of ISIS videos through YouTube’s personalized algorithm. The algorithm is not unlike those used by many dating apps, which recommend certain users to others based on a variety of factors, including previous matches. “Section 230 is vital” to Match Group’s efforts to “provide recommendations to its users for potential matches without having to fear overwhelming litigation,” the brief said.

The US Supreme Court heard arguments in the case in February, marking the first time the nation’s highest judges had considered Section 230’s scope. But the justices seemed squeamish about wading into complex tech issues. Elena Kagan acknowledged she and her colleagues were “not the nine greatest experts on the internet.” Brett Kavanaugh, citing the amicus briefs submitted by major tech groups that argued reinterpreting Section 230 “would create a lot of economic dislocation” and possibly “crash the digital economy,” noted that the justices “are not equipped” to account for the economic fallout of redefining Section 230. In May, the court rendered a decision that sidestepped the role of Section 230 in the case, but effectively sided with Google.

If the courts won’t touch platform immunity (at least for now), that leaves the deeply divided Congress. In fact, Section 230 is a rare policy issue on which lawmakers from both sides of the aisle agree that changes are needed. They just can’t agree on what those changes should look like and how far they should go.

The SAFE TECH Act, sponsored by Sen. Mark Warner (D-Va.) and first introduced in 2021, would allow victims of online harm to seek court orders requiring tech companies to act against perpetrators. The bill would have made it easier for Herrick to compel Grindr to help him, had it been enacted before his litigation. It was reintroduced in February.

The EARN IT Act, a bipartisan bill spearheaded by Sen. Lindsey Graham (R-S.C.), would remove tech platforms’ blanket immunity against third-party content depicting child sexual abuse and create a new commission to establish “best practices” on the prevention of child exploitation online. Match Group broke from other major tech companies by endorsing the bill in 2020. Graham and Sen. Richard Blumenthal (D-Conn.) reintroduced the measure in April.

Lawmakers have also zeroed in on dating app safety. In 2020, a House Oversight subcommittee led by Rep. Raja Krishnamoorthi (D-Ill.) launched an investigation into the underage use of dating apps, but the panel hasn’t provided any public updates on its probe. (A committee staffer said in December that the work was “ongoing.”) That year, Reps. Jan Schakowsky (D-Ill.) and Annie Kuster (D-N.H.) also requested information from Match Group about its efforts to respond to reports of sexual violence. Evidently, Match Group did not ease their anxieties. “We continue to be concerned about reports of sex offenders and dangerous individuals who have accessed your platforms and used them to commit assaults on new victims,” Kuster and Schakowsky wrote in a follow-up letter to Match Group CEO Bernard Kim in July. Kuster previously told me that Congress “must thoroughly examine what dating apps are doing to keep users safe and what new regulations are needed to stop crimes and abuse from being committed on these platforms.”

In 2022, Rep. David Valadao (R-Calif.) introduced legislation that—without altering Section 230—would require dating apps to verify users’ identities with government ID. Valadao said his legislation was important given stories like The Tinder Swindler, a Netflix documentary about a con man who used the app to scam women out of an estimated $10 million. The legislation may also have been born of political motivations: Valadao’s Democratic opponent in 2022, former California State Assemblyman Rudy Salas, had been caught shaving 11 years off his age on dating apps.

Since winning reelection, Valadao has yet to reintroduce his bill, though his office says he plans to do so this term. Records show Match Group lobbied congressional offices on Valadao’s bill.

Meanwhile, some dating apps have gradually started to introduce new safety measures intended to reduce harm and harassment. Bumble launched a partnership with Bloom, which provides Bumble users with free remote courses on surviving trauma. Bumble also offers an optional photo verification tool and AI safety monitoring. Since 2018, Grindr says it has been able to block users from creating multiple profiles using the same device. (That wouldn’t have stopped Herrick’s ex, who used multiple burner phones.) The app also launched an AI tool to screen for usage that violates its terms of service, and it is moving storage of direct messages from users’ devices to the cloud so it can assist law enforcement in cases of criminal abuse. Match Group products have their own optional photo verification tools, and they, too, use AI to detect harmful language. “The safety of users is paramount and our brands’ work to build safer dating platforms is never-ending,” a company spokesperson says.

In 2022, Match Group partnered with Garbo, an online background check platform that collects arrest, conviction, and sex offender registry records. Through Garbo, some Match Group users can pay to screen their dates. So far, the tool appears on Tinder, Match.com, Plenty of Fish, and a single-parent dating app called Stir. With two free searches and $3.25 a search thereafter, it’s far more affordable than professional public records databases, like LexisNexis; easier for users than navigating state-by-state court-record portals; and more reliable than DIY online “people searches.”

Some experts, though, worry about the approach. Valentine, who ran the Utah dating app study, says that while tools like Garbo are a good first step, putting the onus on users to prevent their own abuse potentially “leads to victim-blaming” should they fail to unearth a red flag ahead of time.

Garbo has other limitations. It can’t consistently screen for restraining orders, because courts in most jurisdictions don’t make them easily accessible online. The same is often true for other types of public records. Users with insufficient information about their prospective dates—such as not knowing the person’s phone number or exact date of birth—may struggle to locate records. Garbo founder Kathryn Kosmides thinks many of these issues could be solved by reforming public records laws. “You can’t say, ‘We’re going to require dating apps to background check people,’ and then have a horrible public record infrastructure that makes it nearly impossible,” she argues. “You can’t have it both ways.”

No country currently requires dating apps to check the backgrounds of their users, though foreign governments are increasingly forcing platforms to beef up their safety standards and crack down on deception through other means. Japan has compelled Tinder to verify users’ ages with ID. The European Union is also in the process of rolling out its Digital Services Act, which gives the EU the power to levy large fines on big tech companies that fail to act against problematic content to which they were alerted. Platforms with an EU presence of more than 45 million users—like Facebook and TikTok—are now obligated to undergo independent audits of their risk management measures and take steps to improve safety for women and minors, among other things.

In the US, some states are mulling measures to make the internet safer, particularly for kids. This year, Utah passed legislation requiring age verification for any resident who wants to use social media in the state and allowing lawsuits on behalf of those who allege that social media caused them harm. California has also passed a law, slated to take effect in July 2024, that compels online platforms that are “likely to be accessed by children” to implement measures to protect them. In practice, doing so may require platforms to make age verification mandatory, which may entail requesting government ID.

Such laws will inevitably draw court challenges. But their passage, paired with the judicial system’s increased acceptance of product liability arguments in tech cases and international efforts to increase enforcement power against online platforms, suggests tech companies may soon be more vulnerable to civil claims than they were when Herrick took Grindr to court.

That’s been Herrick’s goal since he first sued Grindr: not to repudiate dating apps, but merely to ensure they create “a safe environment for their users and actually go to the lengths to protect them in extreme circumstances of abuse,” he says. His efforts haven’t been entirely futile. In 2020, Supreme Court Justice Clarence Thomas cited his case in a statement welcoming the reexamination of Section 230. Last year, the Biden administration asked Herrick to advise a White House Task Force on combating online abuse. Many people might consider adding such unique accomplishments to their profiles on dating apps—but Herrick, understandably, still won’t use them.

