[Image: A hooded figure pulls black strings tied to a person's hands, symbolising manipulation in social engineering attacks]

Social Engineering Attacks: Understanding the Psychology Behind Them

Social engineering attacks are as much a psychological game as a technology one. Do you know how to stop them?

Introduction

Social engineering attacks are as much a psychological game as a technology one.

Social engineering is a manipulation technique that tricks people into giving away confidential information or performing risky actions, such as clicking on malicious links or transferring money.

It’s a chess game that bypasses technical defences and takes a real emotional toll on its victims. Think of it as psychological hacking: instead of breaking into your system, the attacker convinces you to open the door for them.

This style of cyberattack has a terrific, and terrifying, success rate, with an estimated 90% of cyberattacks leveraging social engineering.

In this blog, we will discuss social engineering and its impact on human behaviour through a theoretical framework, the Theory of Planned Behaviour, while touching on the different types of social engineering attacks.

Psychological Manipulation at Its Finest 

The human mind is extraordinary, but with many capabilities come many vulnerabilities that cybercriminals can exploit. These weaknesses stem from cognitive biases, social instincts, and emotional triggers.

Cognitive Biases:

In this fast-paced world, we all love shortcuts that get us to our destinations quicker. Our brains work the same way: they form biases based on our surroundings that allow us to process information quickly and make decisions.

This can leave many people vulnerable to deception, especially now that technology has advanced to the point where hackers can gain access to our data and customise attacks based on our thinking patterns. Sound scary?

Let’s dive deeper into these cognitive biases…

Availability Heuristic

We judge how likely something is based on how easily we can recall similar events. If a hacker references a recent data breach in the news, we are likely to believe it and act on a fake alert because we remember it happening. 

Anchoring Bias

The first piece of information we see often becomes our baseline. An attacker may set expectations with a fake invoice or a discount offer; everything that follows feels relative to that “anchor.” 

Overconfidence Bias

People tend to overestimate their ability to spot scams. Ironically, this makes them even more vulnerable. A convincing attacker can slip past their guard.

Loss Aversion

We hate losing more than we enjoy winning. Scammers exploit this by warning victims of losses like an account being closed or leaked data, prompting rash decisions to “prevent” it.

Sunk Cost Fallacy

When we’ve already invested time or money, we’re more likely to keep going even when it’s clearly a bad idea. Attackers may lure victims further down a scam by first asking for small commitments.

Recency Bias

We give more weight to recent information. If someone just read about a new malware scam, they may overlook an older tactic that’s making a comeback.

Social Proof

We follow the crowd, especially when things are uncertain. Hackers might reference how “others are already doing it” or fabricate reviews/testimonials to build trust.

Framing Effect

How something is presented affects how we perceive it. A message framed positively (“Act now to secure your bonus!”) or negatively (“You’ll lose access if you don’t act”) can change how we react, even if the core information is the same.

Beyond Bias: The Emotional and Social Hooks

Hackers don’t stop at cognitive biases. They also play on our emotions and instincts, such as trust and fear.

Authority and Obedience

We’re taught to respect and obey authority. Social engineers mimic figures of power like managers, IT personnel, or law enforcement to pressure people into doing things they wouldn’t otherwise do.

Just imagine:
A hacker impersonates a senior executive from an organisation and emails a junior employee. The message says there’s an urgent issue and requests immediate access to sensitive files. Not wanting to disobey, the employee complies by handing over confidential data to an impostor.

Reciprocity and Trust

People naturally want to return favours. If someone helps you even in a small way, you’re more likely to help them back. Social engineers use this principle to gain trust before exploiting it.

For example, you may receive an email from what appears to be a trusted software provider offering help with a critical update. They may also offer a phone consultation, during which the caller builds rapport by demonstrating technical knowledge. 

Eventually, they ask for your login credentials “just to complete the setup.” You oblige. Moments later, you click a link they send you, unknowingly downloading malware that grants them remote access.

Fear and Emotional Triggers

When emotions like fear or curiosity take over, rational thinking often takes a back seat. Attackers capitalise on these emotions to prompt hasty clicks and poor decisions.

For example, you get an alarming email saying your private data has been exposed online, referencing specific websites you’ve visited to make it seem credible. It urges you to act quickly by clicking a link to “secure your data.” Overwhelmed by fear, you click and infect your system with malware.

Hence, it is important to think before you click. Calm down, take a deep breath, and consider all the ways cybercriminals can harm you!

Together, all these factors interconnect with the Theory of Planned Behaviour (TPB).

I’ll explain how.

Theory of Planned Behaviour 

The Theory of Planned Behaviour (TPB) theorises that human behaviour is influenced by behavioural, normative, and control beliefs, which collectively shape attitudes, subjective norms and perceived behavioural control.

  1. Attitude towards the behaviour – How positively or negatively someone evaluates the outcome of the action.
  2. Subjective norms – The perceived social pressure to perform or not perform the behaviour. Just because everyone else is doing it doesn’t mean you need to. Think before you act!
  3. Perceived behavioural control – The ease or difficulty someone feels in performing (or resisting) the behaviour. In cybersecurity, there is no room for laziness: verify every message you receive, regardless of how confident you feel or how long it takes.

Together, these form the intention behind someone’s actions.
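TPB is often described as combining these three components, roughly as a weighted sum, into behavioural intention. As an illustrative sketch only — the weights and scores below are invented for the example, not empirical values — the idea can be expressed in a few lines of Python:

```python
# Toy model of the Theory of Planned Behaviour: intention as a
# weighted sum of attitude, subjective norms, and perceived
# behavioural control. Weights and scores here are invented for
# illustration, not derived from any study.

def intention(attitude, subjective_norm, perceived_control,
              w_a=0.4, w_n=0.3, w_c=0.3):
    """Each input is a score in [0, 1]; returns intention in [0, 1]."""
    return w_a * attitude + w_n * subjective_norm + w_c * perceived_control

# A phishing email that seems routine (favourable attitude),
# invokes a manager's authority (strong subjective norm), and
# creates urgency so complying feels easier than verifying:
score = intention(attitude=0.9, subjective_norm=0.8, perceived_control=0.85)
print(f"Intention to comply: {score:.2f}")
```

The point of the sketch is the attacker's perspective: each manipulation tactic nudges one of the three inputs upward, and the combined "intention to comply" rises accordingly.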

A successful social engineering attack works not because the target is foolish, but because the attacker has subtly (or sometimes aggressively) ticked all three boxes of the Theory of Planned Behaviour:

How Hackers Exploit It

  • Attitude – Make the request seem safe, routine, or beneficial (e.g. “update your account,” “claim your refund”).
  • Subjective Norms – Imply that others are doing it, or that someone in authority expects it (“Your manager requested this transfer”).
  • Perceived Behavioural Control – Create urgency, fear, or doubt, making the victim feel like they don’t have time or knowledge to double-check.

This is why we call social engineering attacks psychological manipulation: they take into account your attitudes, subjective norms and perceived behavioural control to craft the ‘perfect’ cyberattack, one that requires minimal technical expertise.

Different Types Of Social Engineering Attacks 

Now that we understand the psychological games behind social engineering, let’s look at the most common forms these attacks take.

  1. Phishing: The classic one. A fraudulent email is designed to trick people into clicking links, downloading attachments, or entering login credentials.
  2. Spear Phishing: More targeted than generic phishing. Attackers research their victims and craft personalised emails that feel legitimate.
  3. Vishing: Voice phishing via phone calls pretending to be bank support, tech help, or even law enforcement.
  4. Smishing: SMS phishing, like a fake delivery notification or a security alert asking for account information.
  5. Baiting: Leaving infected USB drives or links in public places (digital or physical) with enticing labels like “Company Payroll” or “Confidential.”
  6. Pretexting: Creating a fake scenario or identity to gain trust, like pretending to be an IT technician needing login credentials to fix a problem.
  7. Quid Pro Quo: Offering something in return for access, like “free” software or gift cards in exchange for login details or network access.

How to Prevent Social Engineering Attacks Using the Theory of Planned Behaviour (TPB)

In this section, we will show how individuals and businesses in the legal, finance, operational technology, and education sectors can use the Theory of Planned Behaviour (TPB) to prevent social engineering attacks from succeeding.

Legal:

  • Attitude – Conduct scenario-based training to reframe compliance: make verification procedures feel just as important as legal diligence.
  • Subjective Norms – Encourage a culture where juniors feel safe challenging instructions that bypass protocol, even if they appear to come from partners. Normalise slow, thoughtful responses.
  • Perceived Control – Implement Pause & Verify systems and internal playbooks that give employees scripts and tools to validate unusual requests.

Finance:

  • Attitude – Frame risk-checking habits as asset protection, not delay. Use gamified training to show that “slowing down” prevents financial losses.
  • Subjective Norms – Highlight leadership’s commitment to security. Use real examples where cautious actions saved the company from loss.
  • Perceived Control – Equip employees and clients with simple decision trees and visual flowcharts to assess email validity or flag suspicious actions.

Other strategies include additional biometric and multi-factor authentication, as well as simulated phishing campaigns.

Operational Technology (OT):

  • Attitude – Frame security compliance not as an inconvenience but as part of system safety; align it with operational safety culture.
  • Subjective Norms – Use team briefings to reinforce that even the most technical employees fall for social engineering unless vigilance is shared.
  • Perceived Control – Train employees to use questioning techniques without fear of backlash: “Who are you? Who sent you? Can I verify that?” Question everything!

Other Strategies that OT can use:

  • Strict on-site identity verification for contractors or new engineers.
  • Physical baiting simulations: leave USBs in the parking lot and train staff not to plug them in.

Education:

  • Attitude – Promote cybersecurity as a life skill. Frame checking links and reporting scams as “digital hygiene” rather than paranoia.
  • Subjective Norms – Build peer-led campaigns with student ambassadors who model secure behaviour and expose phishing attempts publicly.
  • Perceived Control – Create friendly, non-judgmental reporting systems. Even if students or staff click a bad link, make them feel empowered to report it quickly.

Want to learn more about social engineering in education? Check out our article on phishing and social engineering in higher education.

The Final Word

Hackers understand that the weakest link in any security system is often human nature itself, which is why social engineering attacks are by far the most common. By studying our biases, emotional responses, and social behaviour, they create strategies that bypass even the most advanced security measures.

Especially with artificial intelligence (AI) on the rise, attackers can now:

  • Craft polished phishing emails with few, if any, grammatical mistakes.
  • Use data from social media platforms to understand how different people write and view certain topics.
  • Generate deepfake voice or video messages that impersonate real people.

Psychological resilience is one of the most underrated cybersecurity defences, especially for combating social engineering attacks.

By embedding behavioural theory into staff training, digital policies, and everyday conversations, industries like law, finance, OT, and education can evolve from reactive to proactive.

The ultimate goal?

To make security not just a technical shield, but a cognitive habit.

To think before clicking, to pause before acting, and to lead with awareness, not fear.
