Thursday, February 26, 2026

Deception in Social Media (Part 2)

Equip everyone to recognise, resist, and respond wisely to digital deception through biblical discernment.

We begin where Part 1 left off: deception’s mechanics are ancient, but today’s platforms accelerate those mechanics and rewire daily habits. The second half of this short series turns from diagnosis to anthropology—how social media changes minds and bodies now—then moves into concrete pastoral and practical responses. We’ll analyse a popular explainer video, survey psychological and neuroscientific findings, name the ways influencers and comparison culture distort identity, outline the new threats from AI and deepfakes, and finish with biblically rooted guardrails you can put into practice this week. This is intended for personal reflection, small-group discussion, and sermon or teaching use.

5 Crazy Ways Social Media Is Changing Your Brain
A Quick Analysis

A useful entry point for this conversation is the short video AsapSCIENCE produced, titled “5 Crazy Ways Social Media Is Changing Your Brain Right Now.” The video condenses several lines of scientific inquiry into five accessible claims: (1) social media functions more like a psychological addiction than a substance addiction for some people; (2) constant notifications and instant rewards re-engineer attention; (3) social feedback (likes/comments) triggers reward circuitry much like other pleasurable stimuli; (4) phantom vibration/ringing phenomena reveal how sensory expectation rewires sensation; and (5) heavy multitasking and curated presentation change how we process and remember information. The short format is helpful for awareness because it links everyday experience to measurable brain responses, but it also simplifies complex research into bite-sized claims; still, as a conversation starter, it performs well.

The video’s central thesis—that social media creates patterned, reward-driven behaviour—connects to a growing body of work on the brain’s reward system. Experimental neuroscientific studies using fMRI have shown that social feedback (for example, “likes” on a photo) lights up the nucleus accumbens and related reward areas, especially in adolescents exposed to peer evaluation. That neural response overlaps with circuits implicated in other reinforcing activities, explaining why intermittent positive feedback can become habit-forming: the brain learns to expect a reward after an action and will seek that pattern again. This is not moralising the technology as inherently sinful; it is describing an architecture of desire that can either be stewarded or exploited.

The video also highlights “phantom vibration syndrome”—those moments when your pocket seems to buzz though the phone is silent—as a small but telling symptom of techno-adaptation. Research among college students and medical interns finds phantom vibrations are common: many participants report experiencing them regularly, suggesting that persistent notification modes can train sensory expectations and increase distractibility. That everyday oddity points to the larger fact that our bodies can be shaped by notification economies—our nervous systems learn to anticipate digital contact.

Two short cautions about using the video as evidence: first, short explainer videos necessarily compress nuance—studies differ in populations, measures, and effect sizes—so it’s important to move from the video to primary studies for claims you will teach or publish. Second, neurological activation does not by itself give moral meaning; the gospel tells us how to steward desire and what flourishing looks like, not the scanner. Still, the convergence of experiential testimony (people feel hooked), behavioural metrics (compulsive checking), and neural data (reward circuitry activation) is a strong pastoral signal: practices that repeatedly trigger high-arousal reward deserve intentional limits.

Influencer Culture, Comparison, and Identity Distortion

The second major modern vector of deception is the influencer economy. Influencers (people who reliably command attention in specific niches) are now a core advertising channel: brands pay to reach audiences through personalities who can translate product into lifestyle. The influencer industry is large and growing; reports estimate multi-billion dollar market sizes and show that brands regularly amplify influencer content through paid partnerships and media budgets. Influencers can be genuine storytellers who build community, but they can also be curated personas whose livelihood depends on appearing to have a life worth copying. That dynamic monetises aspiration and turns vulnerability into inventory. 

When social media treats life as a market of attention, comparison becomes the routine medium of exchange. Comparison is not morally neutral: it can be a healthy spur to growth when it provokes humility and learning, but the feed often weaponises comparison, producing FOMO (fear of missing out), chronic insecurity, anxiety, and, in some cases, depression. Platforms profit when users experience small psychic deficits—if you feel you lack something, you will return to the feed hoping to find and buy a fix. The apostle Paul’s wisdom about honest self-examination is relevant here: “But let each one examine his own work, and then he will have rejoicing in himself alone and not in another” (Galatians 6:4). That counsel redirects the benchmark from the algorithm’s applause to God-centered self-examination. 

Influencer culture also produces identity distortion. Online identity invites constant editing: filter the face, script the story, trim doubts, and present a streamlined maturity. This is tempting—who does not want to be seen in the best light?—but theologically it is dangerous because it trains self-deception: when our public persona is primarily a curated performance, our private soul can wither. The psalmist counters that distortion with a grounding word of worth: “I will praise You, for I am fearfully and wonderfully made; Marvellous are Your works” (Psalm 139:14). Identity rooted in Christ resists the performance loop because worth becomes a gift, not a metric.

The Comparison Trap:
How Platforms Monetise Insecurity

Meta (the company that owns Facebook and Instagram) and other platform corporations design engagement systems—likes, follows, algorithmic feeds, recommendation loops—that reward emotionally charged content and frequent interactions. The algorithms are not malicious by design, but they are engineered to maximise engagement metrics, and the consequence is that content which elicits envy, outrage, or longing receives disproportionate exposure. Platforms, therefore, have an economic incentive to show material that keeps people coming back; one unintended moral effect is the monetisation of insecurity. That is, anxiety and aspiration become raw material for advertising economies, not merely side effects.

This dynamic explains why influencers often sell not only products but identities and ideals—lifestyles, body standards, romantic narratives, political stances—because those narratives drive engagement and purchases. Some influencers are authentic advocates; many are professional creators whose income depends on sustained desirability and visibility. The healthy Christian response is not merely to condemn but to recover practices of gratitude, modesty, and neighbourly truth-telling that re-orient identity away from metrics. Scripture’s repeated warnings to guard the heart (e.g., Proverbs 4:23) push back against a life governed by audience metrics rather than God’s Word.

Fake News, AI, and Deepfakes
New Tools for Old Deceptions

The landscape of deception has been technologically transformed by three related developments: the persistence of manipulated headlines and out-of-context clips (traditional misinformation), the rapid rise of AI tools that can generate plausible images and text, and the acceleration of deepfake technology that can produce deceptively realistic audio and video. Fake news still relies on many of the same tactics we described in Part 1—partial truths, emotional hooks, manufactured identity—but AI lowers the cost and increases the believability of forged content. The technical difference is a shift from labour-intensive forgery to algorithmic generation at scale. Scholarly reviews and industry reports show this trend clearly and warn that detection will always be playing catch-up with generation. 

Deepfakes in particular pose new moral and civic hazards: they can fabricate speeches, produce false accusations, and weaponise intimate imagery to destroy reputations. Governments, financial institutions, and security agencies now treat deepfake risk as a real security challenge because convincingly altered media can be used for political manipulation or to erode trust in institutions. The spiritual cost is also high: when people become accustomed to doubting every piece of evidence, public trust frays and falsehood gains a new means of influence. That does not mean we must withdraw from civic conversation, but it does mean we must learn evidentiary practices—source verification, cross-checking primary footage, and preferring trusted, accountable outlets—to prevent gaslighting at scale. 

AI also democratizes image and text creation: synthetic photos, AI-authored quotations, and fabricated brand accounts can be created with minimal technical skill. That amplifies the earlier point about manufactured identity—now anyone can make a convincing persona with very little ethical accountability. The pastoral task is to strengthen communal norms: require named authors, insist on transparency for sponsored content, and cultivate public literacy that asks about provenance (who made this, why, and how can it be verified?).

Dangers and Real Harms: Examples from Research

The harms described above are not abstract. Systematic reviews and longitudinal studies link compulsive or problematic social media use to increased risk of anxiety, depressive symptoms, and greater emotional difficulties in some populations—especially adolescents whose social development is still forming. Other large studies nuance the picture, showing that context and type of use matter: active, socially supportive use is less likely to harm than passive, comparison-driven scrolling. The mixed findings caution against simple alarmism but point to an important truth—how we use platforms matters as much as how much we use them. 

Other empirical findings we have already mentioned bear repeating because they are practical: (1) social feedback activates reward circuitry; (2) phantom vibration and ringing phenomena are common and correlate with stress and frequent phone use; and (3) heavy multitasking tied to frequent switching between apps reduces the ability to sustain attention and commit information to long-term memory. Taken together, these findings tell a pastoral story: persistent, reward-driven engagement reshapes attention, emotion, and memory—core capacities for spiritual formation—and therefore demands intentional disciplines. 

Biblical Responses:
Unchanging Strategies for an Unchanging Enemy

Theologically, none of these technological developments is new in kind—only in scale. Satan’s strategy of mixing truth with lies, appealing to desires, and promising autonomy repeats from Genesis into the feed. Jesus’ counter to deception is not clever debunking but the person and claim of truth: “I am the way, and the truth, and the life” (John 14:6 NASB). Correct belief about Christ reorders affection and anchors identity in a Word and Person rather than an algorithm. Christian formation, therefore, includes cognitive work (learning to verify, to read context) and deep soul work (practices that cultivate gratitude, Sabbath, and neighbourly confession). Christian discernment is public and communal; it is not merely private scepticism but the recovery of practices that form people into those who love God and neighbour faithfully.

Practical Guardrails
Five Digital Disciplines

Practical spiritual life in digital culture requires habits. Here are five simple disciplines to practice this week:

  1. Pause before you share. If a post provokes a hot reaction, wait—get a primary source; confirm context; ask who benefits. This delays the arithmetic of virality and restores deliberation.

  2. Verify before believing. Track claims to original reporting, screenshots to primary documents, and images to reverse-image searches when necessary. Favour reputable outlets with transparent sourcing.

  3. Limit screen time. Build intentional windows of use and device-free rituals—meals, prayer times, and the Sabbath—to reclaim rhythm and attention.

  4. Fast from social media. Practice short fasts (a day, a weekend) or longer sabbaths to test the soul’s dependence and to practice contentment apart from the feed. Structured fasts reveal appetites and open space for prayer.

  5. Anchor identity in Christ. Regular confession, Scripture memorisation, small-group accountability, and service reduce the tyranny of appearance and shift worth from metrics to grace. These disciplines are not legalistic controls but means of grace to reshape desire.

Each discipline maps to practical tools (screen limits, notification settings, curated feeds) and spiritual practices (examen, confession, corporate worship). In combination, they form both firewall and furnace—protection and formation.

Short Action Plan for Church Leaders and Families

Leaders can translate the five disciplines into practical ministries: preaching a short sermon series on digital discipleship, running a weeklong congregation-wide social media fast, training teens to verify sources, and establishing small-group check-ins where people confess what they notice about their feeds. Parents should consider age-appropriate rules, shared family tech agreements, and modelling restraint publicly—children mirror the adult heart. Churches are especially well-suited to offer communal accountability because formation happens best inside relationships, not solo willpower.

Reflective Questions for Personal and Group Use

Discernment begins with honest questions. Try these in a small group or journal: What lies have I believed because of social media? What emotion typically drives my scrolling (boredom, envy, loneliness, anger)? What would change in my feed if I removed one app for a week? Which neglected relationships would I reinvest in if I gave less attention to my persona? Honest answers to these questions often point to simple next steps that lead to lasting reformation.

Short Case Example

Imagine a pastor who sees a viral clip attacking a public figure from a local church. The clip is compelling, shared by friends, and generates outrage. The pastor’s response could be reactive—retweeting condemnation—or deliberative—pausing, seeking the unedited clip, contacting primary sources, praying, and then speaking with measured charity. The second path costs time and may reduce short-term engagement, but it preserves the pastor’s integrity and forms the congregation toward truth. The church needs leaders who model slow, faithful responses in a hurried culture.

The Long Game: Formation over Correction

Short interventions—flagging a post, debunking a viral lie—matter, but they are not enough. The sustained task is formation: teaching habits of attention, developing moral imagination, practising confession, and embedding digital fasting into spiritual rhythms. Scripture’s pedagogies—repetition, liturgy, community—are precisely the tools needed to withstand the attention economies. When Christians recover these practices, they become sources of clarity: communities that do not merely refute lies but embody an alternative way of living.

Closing exhortation

Technology will continue to change; culture will invent new devices and persuasive architectures. Our lasting hope is not to master every new tool but to be mastered by Christ, who forms hearts that can say “no” to deceptive enticements and “yes” to lives of truth and love. Ask yourself honestly: what do I need to stop, start, and sustain? Then invite a friend or a small group to join you in those next steps. The work of discernment is both individual and communal; it is cognitive and spiritual; it is immediate and lifelong.


References

Altuncu, E., et al. (2024). Deepfake: Definitions, performance metrics and standards. Frontiers in Computer Science. https://doi.org/10.XXXX/XXXXX

AsapSCIENCE. (2014, September 7). 5 Crazy Ways Social Media Is Changing Your Brain Right Now [Video]. YouTube. https://www.youtube.com/watch?v=HffWFd_6bJ0

Drouin, M., et al. (2012). Phantom vibrations among undergraduates: Prevalence and correlates. Computers in Human Behavior, 28(6), 2374–2379. https://doi.org/10.1016/j.chb.2012.07.015

Keles, B., et al. (2020). A systematic review: The influence of social media on depression, anxiety and psychological distress in adolescents. International Journal of Adolescence and Youth, 25(1), 79–93.

Kelly, Y., et al. (2018). Social media use and adolescent mental health: Findings from the UK Millennium Cohort Study. EClinicalMedicine, 6, 59–68.

Linqia. (2023). The State of Influencer Marketing 2023 [Report]. Linqia.

Mangot, A. G., et al. (2018). Prevalence and pattern of phantom vibrating and ringing among medical interns. Journal of Clinical and Diagnostic Research, 12(9), LC01–LC04.

Meta. (2021, October 28). The Facebook company is now Meta [Blog post]. Meta Newsroom. https://about.fb.com/news/2021/10/facebook-company-is-now-meta/

Sensity (formerly Deeptrace). (2020–2023). State of deepfakes: Deepfake threat intelligence reports. Sensity AI. https://sensity.ai/reports/

Wired. (2017). What social media does to a teenager’s brain [Summary of a UCLA study]. Wired.