3. Policy Making
Helps governments make informed digital policies (e.g., digital
inclusion, cybersecurity laws).
Evaluates tech impact on society before large-scale adoption.
Example: Malaysia’s Personal Data Protection Act (PDPA) 2010 was shaped
by growing concern over personal data misuse in the digital age.
4. Technology Design
Guides developers to consider social, cultural, and ethical factors
in design.
Promotes inclusive and user-centered systems.
Example: Screen readers and voice assistants for people with
disabilities reflect socially informed design, as the sketch below illustrates.
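A minimal sketch of what such socially informed design can look like in code, assuming a hypothetical page model (the element dictionaries and function name are invented for illustration, not any real framework's API): an audit that flags images missing the alt text screen readers rely on.

```python
# Hypothetical sketch: auditing content for screen-reader accessibility.
# The page structure and function name are invented for illustration.

def find_images_missing_alt(elements):
    """Return image elements lacking the alt text a screen reader announces."""
    return [
        el for el in elements
        if el.get("tag") == "img" and not el.get("alt", "").strip()
    ]

page = [
    {"tag": "img", "src": "chart.png", "alt": "Bar chart of 2023 enrolment"},
    {"tag": "img", "src": "banner.png", "alt": ""},  # no description given
    {"tag": "p", "text": "Welcome to the course portal."},
]

for el in find_images_missing_alt(page):
    print(f"Missing alt text: {el['src']}")  # flags banner.png
```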
5. Education
Helps educators understand the digital divide and design
equitable e-learning tools.
Supports responsible digital citizenship.
Example: Malaysia’s DELIMa platform gives students access to digital
learning tools, helping to narrow the education gap during the COVID-19 pandemic.
6. Business Strategy
Assesses social behavior on online platforms (e.g., consumer trust,
data ethics).
Improves human-centered innovation in digital services.
Example: Grab uses location data and user feedback to optimize
services, but must balance this with data privacy concerns; see the sketch below.
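One hedged illustration of that balance, with invented data and grid size (this is not Grab's actual pipeline): coarsening raw GPS points into grid cells before counting demand, so aggregate patterns stay useful while exact individual locations are discarded.

```python
from collections import Counter

# Illustrative sketch: coarsen GPS pickups to a grid before analysis.
# Coordinates and the 2-decimal grid (~1.1 km of latitude) are assumptions.

def to_cell(lat, lon, precision=2):
    """Round coordinates to a coarse grid cell instead of keeping exact points."""
    return (round(lat, precision), round(lon, precision))

pickups = [
    (3.1390, 101.6869),  # central Kuala Lumpur
    (3.1412, 101.6870),  # a second pickup nearby
    (3.0738, 101.5183),  # Shah Alam
]

demand = Counter(to_cell(lat, lon) for lat, lon in pickups)
for cell, count in demand.items():
    print(f"cell {cell}: {count} pickups")  # per-cell totals, no raw traces
```

Only the coarse counts leave this step, which is enough to plan driver supply without retaining individual movement histories.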
7. Healthcare Services
Ensures digital health apps are accessible to all communities.
Studies how people interact with telemedicine and e-health platforms.
Example: Malaysia’s MySejahtera app tracks vaccinations, check-ins, and
health updates, but it has raised public concerns about privacy.
8. Public Communication & Social Media
Helps analyze the social effects of algorithms (e.g., misinformation, echo
chambers), illustrated by the toy sketch below.
Encourages ethical content moderation and digital literacy.
Covers platforms such as Facebook, Instagram, WhatsApp, TikTok, and Tinder.
Example: Platforms like Facebook and TikTok must deal with fake news,
hate speech, and mental health impacts.
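To make the echo-chamber effect concrete, here is a toy sketch with invented posts and a deliberately simple scoring rule (real feed-ranking systems are far more complex): ranking purely by past engagement keeps resurfacing one topic and buries the rest.

```python
# Toy sketch: engagement-based ranking narrowing a feed into an echo chamber.
# Posts, topics, and the scoring rule are all invented for illustration.

posts = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "science"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "sports"},
    {"id": 5, "topic": "politics"},
]

clicked_topics = ["politics", "politics"]  # the user's past clicks

def engagement_score(post):
    # Reward topics the user already clicked; unfamiliar topics score zero,
    # so the feed keeps showing more of the same.
    return clicked_topics.count(post["topic"])

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["topic"] for p in feed])
# -> ['politics', 'politics', 'politics', 'science', 'sports']
```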
9. Digital Divide
Unequal access to technology creates social inequality.
Marginalized groups may be left out of digital solutions.
Example: Rural areas in Malaysia may not have fast internet for
online learning or telehealth services.
10. Privacy & Data Ethics
Increasing data collection raises concerns about surveillance and
misuse.
Many users lack awareness of how their data is used.
Example: Apps like Google Maps and shopping platforms track user
behavior for targeted ads, yet few users understand this or consent
meaningfully; the sketch below shows a consent-gated alternative.
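As a contrast, here is a hedged sketch of consent-gated tracking (names and data invented, not any real platform's API): behavioral events are recorded only after an explicit opt-in, and anything else is dropped rather than logged silently.

```python
# Hypothetical sketch: record behavioral events only with explicit consent.

analytics_log = []

def track_event(user, event):
    """Store an event only if this user has explicitly opted in to tracking."""
    if not user.get("consented_to_tracking", False):
        return  # no silent collection: the event is simply dropped
    analytics_log.append({"user": user["id"], "event": event})

alice = {"id": "alice", "consented_to_tracking": True}
bob = {"id": "bob", "consented_to_tracking": False}

track_event(alice, "viewed_product:running_shoes")
track_event(bob, "viewed_product:running_shoes")  # dropped: Bob never opted in

print(analytics_log)  # only Alice's event was stored
```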
11. Technology Bias
Algorithms may reflect or worsen societal biases (e.g., race,
gender).
Need for transparency and fairness in system design.
Example: AI-based hiring systems can favor certain demographics if
trained on biased data (e.g., data dominated by white-collar male
applicants); a simple audit sketch follows below.
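A common first audit for this kind of bias is demographic parity: compare the selection rate each group receives. The sketch below uses invented shortlisting decisions purely to show how skew surfaces in the numbers.

```python
from collections import defaultdict

# Sketch of a demographic-parity audit on hiring decisions.
# The (group, shortlisted) records are invented for illustration.

decisions = [
    ("male", True), ("male", True), ("male", True), ("male", False),
    ("female", True), ("female", False), ("female", False), ("female", False),
]

selected = defaultdict(int)
total = defaultdict(int)
for group, shortlisted in decisions:
    total[group] += 1
    selected[group] += shortlisted  # True counts as 1, False as 0

for group in total:
    print(f"{group}: selection rate {selected[group] / total[group]:.0%}")
# male: 75%, female: 25%; a gap this size warrants examining the training
# data and features before the system is deployed.
```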
12. Resistance to Adoption
Cultural or generational gaps may hinder tech use.
Trust issues in new technologies (AI, blockchain, etc.).
Example: Elderly users might struggle with mobile banking apps or QR
code systems.
13. Information Overload
Too much digital information can confuse or mislead users.
Increases spread of fake news and reduces attention spans.
Example: During COVID-19, misinformation spread rapidly through
WhatsApp and Facebook, confusing the public and affecting health
decisions.
14. Lack of Multidisciplinary Collaboration
Engineers, social scientists, and policymakers often work in silos.
Limits holistic solutions that address both tech and social impact.
Example: A smart city plan may fail if it doesn’t consider the needs of
marginalized groups like the elderly, disabled, or low-income
households.