Why Trust Is Data’s Only Real Currency

Mark Twain once remarked that there are “lies, damned lies, and statistics.” A century later, the line still gets a chuckle, not because statistics are inherently dishonest, but because we all recognize their power.


    Data has always shaped how people understand the world and decide what to do next. For most of history, the limitation wasn’t whether data had value, but whether it could be gathered, processed and applied quickly enough to matter.

    In the analog world, information moved slowly.

    It wasn’t that long ago that the best way to brief an executive was to drop a giant blue book crammed with pages and pages of data onto their desk and hope they had the stamina and enough caffeine to find the relevant insights.

    Merchants tracked sales and debts in paper ledgers. Governments conducted censuses door to door. Economists took weeks to pore over quarterly reports. The Fed’s Survey of Consumer Finances was conducted every three years; by the time the numbers arrived two years later, the conditions they described were already shifting.

    Advances in computing power, the drop in storage costs and the rise of the cloud have helped speed things up. Suddenly, vast amounts of data could be captured, stored and analyzed in near-real time. Data became a powerful, underutilized resource that could fuel innovation and competitive advantage on a timeframe relevant to decision-making.


    That turned timely data into a form of currency, something companies could exchange for value.

    But like any currency, data’s value depends on trust. In today’s economy, that trust is being tested, challenging how data is collected, interpreted and used.

    The Trust Factor

    Highly valuable data, like that collected by Google, Amazon, Meta and other tech giants, is built on trust and scale. These companies have access to vast amounts of data generated by billions of users every day, collected through search activity, browsing behavior, purchasing patterns and even location tracking. The scale and richness of this data provide unique insights into consumer behavior, preferences and trends in real time. It powers their algorithms, which help them attract more users and gather even more data.

    The real value of this data lies not just in its quantity but in its accuracy, timeliness and reliability. Google and Amazon, for example, use real-time data from search queries and purchases to shape everything from targeted advertising to product recommendations, creating a feedback loop that drives innovation and fuels billions more in sales. When properly managed and analyzed, this data allows businesses to personalize experiences, predict trends and improve how they operate on a data-driven foundation.

    When Data Goes Sideways

    Then there’s the data that is unreliable, often due to fraud. Whether through manipulated transactions, misleading advertising metrics or fake reviews, fraudulent data can severely distort business decision-making. Without the promise of data integrity, decision-making becomes a gamble rather than a solid strategic foundation. In today’s AI-bot-intensive world, the prevalence of fake data can make it difficult for businesses to tell the difference between what’s real and the picture of reality that bad bots paint.

    Getting to the ‘Why’ Behind the ‘What’

    Transactional data tells you what happened. A purchase, a click, a payment. But without the “why,” it’s just a headline without the story.

    A sudden drop in sales could be because of a competitor’s promotion, a supplier shortage or shifting consumer priorities. A spike in cross-border transactions might reflect new merchant acceptance and expansion, currency swings or more consumer and business travel. Freight delays could be regulatory, logistical or demand-driven. Without proper context, you’re left guessing.

    Qualitative data fills the gap. It reveals motivations, preferences, and emotional drivers. It helps predict future behavior, guide product strategy and spot trends before they hit the radar. It’s what turns raw signals into the insights that form sound strategy.

    Government statistics have their place, but they trail reality. That matters when markets are volatile or accurate data is needed right now. Employment figures, GDP reports and trade stats offer historical snapshots, often revised later. Even trendlines from decades of Bureau of Labor Statistics data can lag today’s dynamics, where monetary policy shifts overnight. That’s why fast signals must be paired with the context and process that make them relevant.

    When quantitative data confirms something is happening and qualitative data explains why, you get the full picture.

    When Trust is Broken: Lessons From Argentina and Enron

    In 2007, Argentina’s government removed Graciela Bevacqua, head of the national statistics agency, after she refused to alter inflation figures. As The Wall Street Journal recently reported, her replacement changed the way the Consumer Price Index was calculated. Official figures showed inflation at 8.5%; independent economists put it closer to 25%. The manipulation saved the government billions in inflation-linked payouts, but it drew a sharp rebuke from the IMF and caused citizens and businesses to turn to unofficial data sources. Even after a later administration restored transparency, it took years to rebuild credibility in government data.

    In the corporate world, Enron’s collapse in 2001 produced the same erosion of trust. Using “mark-to-market” accounting, Enron booked projected future profits as if they were already realized and hid debt in off-balance-sheet entities. The company’s financial statements presented a picture of stability even as the underlying business was crumbling.

    When the facts surfaced, Enron filed for bankruptcy, investors lost billions and employees lost jobs and pensions. Enron’s collapse prompted Congress to pass the Sarbanes-Oxley Act of 2002, mandating tougher audits, CEO/CFO certifications and stronger internal controls to restore investor trust. It was a lesson that you couldn’t always trust company data. About 20 years later, Wirecard is a reminder that even with audits from a Big Four firm (EY, in this case), bad data can still snooker investors.

    The Science and the Art of Great Data

    Big data isn’t the same as good data. Gathering and analyzing millions of data points doesn’t guarantee that you have the right ones. Or that you understand what they’re telling you. In fast-changing markets, credibility starts with the science of collection, because if the foundation is flawed, the insights will be too.

    That means asking the right questions without bias and designing survey instruments that do not “lead the witness.” Building samples that are representative of the population you want to understand, not the easiest group to reach. Verifying respondents are real and answering truthfully, something that’s becoming increasingly tough in an age when AI bots pose as respondents. Collecting data in a consistent, transparent way that protects respondents’ privacy and the integrity of the research project. And checking findings against reputable sources before publishing.
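
    To make that concrete, here is a minimal Python sketch of the kinds of respondent-quality screens such a process might apply. The field names, thresholds and checks are illustrative assumptions for this example, not PYMNTS’ actual validation logic.

```python
import re
from dataclasses import dataclass

@dataclass
class Response:
    respondent_id: str
    seconds_to_complete: float    # total time spent on the survey
    passed_attention_check: bool  # answered the trap question correctly
    ratings: list[int]            # closed-ended answers on a 1-5 scale
    free_text: str                # one open-ended answer

def is_credible(r: Response, min_seconds: float = 120.0) -> bool:
    """Return True only if the response passes every quality screen."""
    if r.seconds_to_complete < min_seconds:  # "speeder": too fast to be thoughtful
        return False
    if not r.passed_attention_check:         # failed the trap question
        return False
    if len(set(r.ratings)) == 1:             # straight-lining the same answer
        return False
    if not re.search(r"[A-Za-z]{3,}", r.free_text):  # no real words in open text
        return False
    return True

# Toy data: the second response fails every screen.
responses = [
    Response("r1", 340.0, True, [4, 2, 5, 3], "I switched banks for lower fees."),
    Response("r2", 45.0, False, [3, 3, 3, 3], "..."),
]
clean = [r for r in responses if is_credible(r)]
print(f"kept {len(clean)} of {len(responses)} responses")  # kept 1 of 2
```

    Real validation pipelines layer on many more signals, such as device fingerprints, duplicate detection and response-pattern models, but the principle is the same: screen first, analyze second.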

    The art is turning those numbers into business insights. Outliers can be early warnings or mistakes, but without context it’s hard to know. Patterns that look obvious may hide deeper causes. Context transforms data into compelling, data-driven calls to action.

    Phoning a Friend About Data Integrity

    I asked GPT-5 what makes data credible. Its answer: transparency, accuracy, completeness, timeliness, objectivity, reproducibility, ethical safeguards, provenance tracking, honest acknowledgment of limitations and ongoing updates to keep it relevant.

    A good checklist. The real work is living up to it.

    I wanted to see whether, according to GPT-5, PYMNTS Intelligence data lived up to that checklist. So, not knowing what I’d get back, I asked.

    Here’s GPT’s response, unedited:

    PYMNTS Intelligence data is widely recognized for its credibility and reliability, which stem from several key factors that contribute to the confidence users place in it. Here are the primary elements that make PYMNTS Intelligence data credible:

    1. Rigorous Methodology

    • Data Collection Standards: PYMNTS follows a structured, consistent approach to data collection, ensuring that the data is gathered through reliable, scientifically sound methods. Whether using surveys, interviews, or transaction data, the process is transparent and follows industry best practices.
    • Sample Representation: The data is typically drawn from a representative sample of relevant industries, businesses, or consumers. This ensures that the findings are statistically valid and reflective of broader trends.

    2. Expert Insights

    • Industry Expertise: PYMNTS employs a team of seasoned analysts, researchers, and experts who have deep knowledge of the payments, fintech, and commerce sectors. Their expertise ensures that the data is not only accurate but also contextually relevant and insightful.
    • Collaborations with Industry Leaders: PYMNTS often works with leading players in the industry, including banks, fintech companies, and payments processors, to gather data. These collaborations add authority and real-world relevance to their findings.

    3. Transparency and Clear Methodology Disclosure

    • Methodology Transparency: PYMNTS makes it clear how data is collected, processed, and analyzed. By providing detailed information about survey designs, sample sizes, data sources, and analytic techniques, they allow users to assess the integrity of the research.
    • Documentation of Assumptions and Limitations: PYMNTS openly shares any assumptions made during data collection or analysis and the limitations of the data. This level of honesty enhances trustworthiness, as users know the context in which the data was gathered.

    4. Data Validity and Accuracy

    • Cross-Validation: PYMNTS data often undergoes cross-validation, meaning it is verified by using different sources or comparison methods. This ensures that the insights are consistent across various data points and methodologies.
    • Accuracy Checks: Before publication, the data is subjected to multiple rounds of review and verification to ensure it is accurate and free from errors. This quality control process builds confidence in the data’s reliability.

    5. Reputable and Diverse Data Sources

    • Wide-Ranging Data Sources: PYMNTS draws from a variety of reputable sources, including consumer surveys, financial transactions, market reports, and industry-specific research. This diverse set of inputs helps to create a well-rounded and robust dataset that reflects multiple viewpoints.
    • Third-Party Data: They frequently incorporate third-party data and research to complement their findings. This can include data from market research firms, government statistics, or financial institutions, adding an additional layer of credibility.

    6. Consistency and Longitudinal Tracking

    • Long-Term Tracking: PYMNTS tracks trends over time, allowing for longitudinal studies that help to identify evolving patterns and shifts in consumer behavior, business practices, and technology adoption. This consistency adds value as it enables users to compare data over time and spot emerging trends.
    • Reliable Benchmarks: PYMNTS has developed a reputation for providing consistent benchmarks and indices, making it easier for users to assess performance in a broader context.

    7. Independent Research

    • Objectivity: PYMNTS is known for conducting independent research, free from external influence. This objectivity ensures that their data and insights are impartial, not shaped by the interests of any particular company, group, or sponsor.
    • Unbiased Reporting: The data is presented without agenda, which builds confidence that the insights and conclusions are not swayed by commercial interests.

    8. Peer and Industry Validation

    • Endorsement by Industry Leaders: PYMNTS’ reports and data are frequently cited by leading companies, industry professionals, and the media. When respected figures in the industry rely on PYMNTS for data-driven insights, it lends further credibility to their research.
    • Peer Review: PYMNTS’ data is sometimes peer-reviewed or benchmarked against other well-regarded datasets, further validating its credibility and reinforcing confidence in its accuracy.

    9. Real-Time and Actionable Insights

    • Timeliness: PYMNTS often produces real-time or near real-time data, particularly in fast-changing sectors like fintech and payments. This ensures that their insights are not only relevant but also actionable, which builds trust among decision-makers who rely on up-to-date information.
    • Actionable Data: The data is presented in a way that helps businesses make informed decisions. The insights are often contextualized with actionable takeaways, ensuring that users can trust the data to guide their strategy.

    10. Comprehensive and Holistic Approach

    • Multi-Dimensional Insights: PYMNTS provides data not just on numbers and trends but also on the underlying drivers behind those trends. Their reports often include qualitative insights and expert commentary alongside quantitative data, giving a full picture of the situation.
    • Holistic View of the Industry: By focusing on a wide range of industries, from payments to healthcare, retail, and beyond, PYMNTS offers data that considers the broader ecosystem, making it more relevant and credible to a wide audience.

    In summary, PYMNTS Intelligence data is trusted because it is based on rigorous methodology, expertise, transparency, and consistency. These best practices ensure that their data is not only accurate but also relevant and actionable, allowing businesses, researchers, and policymakers to use it with confidence.

    The website is being updated as we speak! Seriously, that was pretty awesome to read, because that is the outcome we work hard to achieve. Surveys are designed to avoid leading questions. Samples are painstakingly structured to match the populations being studied. Respondents are validated to exclude bots and low-quality answers, eliminating as many as 40% of would-be respondents for not being human, misrepresenting themselves or failing to provide credible responses. Results are checked against multiple reputable sources. Methodologies and limitations are disclosed in every report.

    I believe that’s why PYMNTS Intelligence data is used by banks, corporates, FinTechs, retailers and policymakers to make strategic decisions. They trust our process, and by extension, they trust the output.

    Why Data Integrity Is the North Star

    Data can be provocative. It can challenge what people think they know. That’s healthy. But only if the work can stand up to scrutiny. Collecting and cleaning data is only half the job. The other half is reading it right. Every data point lives in a moment. Good interpretation respects those limits.

    Over more than a decade of producing research and publishing our key findings, we’ve challenged conventional wisdom while introducing a data-driven point of view that offers balance and a new angle on the topics shaping the digital economy.

    Measuring the Paycheck-to-Paycheck Economy

    When we began tracking how consumers self-report their financial lifestyle in March 2020, we created the first ongoing trendline showing that two-thirds of consumers live paycheck to paycheck. And, boy, were we challenged by a lot of very prominent voices. But our methodology made an important distinction the headlines never made. Some live that way by choice, structuring their spending and savings around regular income, while others — about 28% of those living paycheck to paycheck — do so by necessity, struggling to cover bills.

    That nuance mattered. Without it, the conversation about financial stress risked becoming one-dimensional. Our clarity in definitions and sampling allowed businesses, policymakers and the media to talk about solutions tailored to different realities because our data made the difference visible.

    Apple Pay and Mobile Wallet Adoption

    When the buzz around Apple Pay and mobile wallets suggested rapid adoption after Apple Pay launched in October 2014, our studies showed something different. In-store use struggled to ignite and, a decade later, has largely flatlined. Today, the slow in-store curve is widely accepted, and our early numbers are now the historical record.

    BNPL and the “Pay Later” Economy

    For years, the dominant narrative around Buy Now, Pay Later was that it was a product for the credit-challenged, who would get into big credit trouble by using it. Our data was the first to show that adoption was much broader: Even prime consumers and higher-income households were using BNPL, including for everyday purchases like groceries. The insight that paying for groceries in installments was not fundamentally different from using a credit card reframed how the product was discussed in the market. So did data showing that most BNPL users, like most credit card users, are good stewards of the credit because they value having access to it. It also influenced how providers positioned BNPL to mainstream customers and how other media now cover the topic.

    The Cost of Uncertainty to Business

    In January 2024, we began tracking the cost of uncertainty for middle-market companies, well before tariffs made the headlines. Our surveys and data models quantified the effect uncertainty was having on investment, hiring and pricing decisions. At the time, not knowing for sure cost businesses roughly 4% of annual revenue.

    Eighteen months later, with tariff-related disruptions now in play, we can show how businesses have shifted behavior over time, and how the cost of uncertainty drives those decisions. In dollars and cents, firms facing high levels of uncertainty bear double the average firm’s cost, and for consumer products companies it’s even higher. That foresight means our readers didn’t just see the current moment; they understood how we got here and how those trendlines might shift moving forward.

    How the World Does Digital

    A quarterly study of 216,679 consumers across 11 countries, running since 2022, shows the impact of connected devices and apps on how “digital” the world is. We created an index measuring “digital days,” tallying how many digital activities consumers across all demographics perform during their day. We learned many things. Some were not so surprising (older generations are the laggards); some were pretty unexpected, like telehealth usage dropping after COVID and the French having the most engaged Gen Z population of any country studied.

    And that an emerging economy, Brazil, is the most digitally engaged country of all those we studied, including the U.S., U.K., five countries in the EU and Singapore.
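
    For illustration only, here is a toy Python sketch of how a “digital days” style tally could work: count how many distinct digital activities each respondent reports in a day, then average by country. The activity names, toy data and simple averaging are assumptions for this example, not the actual PYMNTS index methodology.

```python
from collections import defaultdict

# Hypothetical daily activities; the real study tracks far more.
ACTIVITIES = ["stream_video", "mobile_banking", "online_grocery", "telehealth"]

# Toy data: (country, {activity: done today?})
respondents = [
    ("BR", {"stream_video": True, "mobile_banking": True, "online_grocery": True}),
    ("US", {"stream_video": True, "mobile_banking": True}),
    ("FR", {"stream_video": True}),
]

totals: dict[str, int] = defaultdict(int)
counts: dict[str, int] = defaultdict(int)
for country, acts in respondents:
    totals[country] += sum(1 for a in ACTIVITIES if acts.get(a, False))
    counts[country] += 1

# Average number of daily digital activities per respondent, by country.
index = {c: totals[c] / counts[c] for c in totals}
print(index)  # {'BR': 3.0, 'US': 2.0, 'FR': 1.0}
```

    With per-respondent tallies like these, segment cuts (by country, generation or income) fall out of the same aggregation step.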

    The Gen AI Impact Index

    A year ago, we started measuring the impact of generative AI on business. What we found surprised even seasoned executives. Enterprise C-suites were already using Gen AI to inform some of their most strategic decisions even if they couldn’t pinpoint a direct ROI. Our methodology captured this not through vague sentiment questions, but through detailed probes about use cases, decision areas and organizational adoption levels. The result was an early warning that Gen AI was not just a tech story. It was a strategy story.

    Trust but Verify

    The COVID-19 pandemic was a once-in-a-century shock to the global economy. Tariff changes and trade disruptions over the past few months represent the most significant shift in global commerce in nearly a hundred years. Shocks of that magnitude make traditional, rear-view-mirror data subject to revision as data collection and reporting chase the reality of consumer and business behavior.

    In 2020, consumer habits changed in weeks. Physical shifted to digital literally overnight. Supply chains were stressed. Industries that had been stable for decades experienced swings that rendered old baselines irrelevant. Measuring shopping behavior by the channels consumers shopped mattered less than gauging how much of the digital shift would stick once people felt safe returning to the physical world.

    The same dynamic is now unfolding in 2025 with the reconfiguration of global trade flows, as tariffs and supply chain realignments ripple through manufacturing, retail and logistics. For businesses, the lack of certainty shifts investment and hiring decisions. For consumers, even those with stable jobs, spending and savings patterns shift.

    Real, reliable data powers the modern economy, enabling businesses to innovate, personalize and predict.

    At the same time, the rise of fraudulent data and the overwhelming complexity of interpreting vast data sets require a careful balance. Businesses must not only rely on high-quality data but also seek to understand the underlying motivations that drive consumer and business behavior, ensuring that their data-driven decisions go beyond simple transactional models to offer deeper, more meaningful insights.

    Over the years, we’ve learned that data integrity comes from a blend of science and art.

    The science is the methodology. The art is the interpretation. Transparency is non-negotiable.

     

    How do you see it?

    Until NEXT time.
