By Anuradha Gandhi and Rachita Thakur
Introduction
Recent consumer complaints and various experiments have sparked a debate about fairness in digital pricing strategies. Union Consumer Affairs Minister Pralhad Joshi revealed that the Central Consumer Protection Authority (CCPA) has issued notices to tech giants and ride-hailing platforms following concerns over software performance and pricing inconsistencies: the same product or ride was shown at different prices to Android and iPhone users.[1]
These discrepancies have fueled growing discussion about the transparency of dynamic pricing models and the potential for unequal treatment across platforms. Research shows that 79% of consumers are concerned about how companies use their data.[2]
What is dynamic pricing and how does it impact consumers?
Dynamic pricing sets product prices based on various external factors, including current market demand, the season, supply changes and price bounding. With dynamic pricing, product prices adjust continuously, sometimes within minutes, in response to real-time supply and demand.[3]
Fluctuations in consumer demand play a significant role in determining prices: higher demand often leads to higher prices, a practice also termed surge pricing. Seasonal variations, holidays and other trends can also affect pricing; for example, rides cost more when it is raining than when the weather is normal.[4]
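The demand-driven surge mechanism described above can be sketched in a few lines. This is a purely illustrative model, not any platform's actual formula: the linear demand/supply ratio, the base fare, and the cap are all assumptions made for the example.

```python
def surge_multiplier(active_requests: int, available_drivers: int,
                     floor: float = 1.0, cap: float = 3.0) -> float:
    """Illustrative surge multiplier: price scales with the demand/supply ratio.

    The multiplier never drops below the floor (no discount when supply is
    ample) and is capped to avoid runaway prices; both bounds are assumptions.
    """
    if available_drivers == 0:
        return cap  # no supply at all: charge the maximum allowed surge
    ratio = active_requests / available_drivers
    return round(max(floor, min(ratio, cap)), 2)

base_fare = 120.0  # hypothetical base fare

# Demand outstrips supply: 150 requests vs 60 drivers -> 2.5x surge.
print(base_fare * surge_multiplier(150, 60))
# Normal conditions: supply exceeds demand -> no surge, base fare applies.
print(base_fare * surge_multiplier(40, 80))
```

Note that the multiplier reacts only to aggregate market conditions, not to who the rider is; that distinction is what separates dynamic pricing from the price discrimination discussed next.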
Dynamic pricing vs. price discrimination
Dynamic pricing requires retailers and brands to have access to real-time supply and demand data which is important for tracking the market and competitors. They also need to have the ability to collect, analyze, and act on first- and third-party customer data to understand their unique customer segments.
Price discrimination, on the other hand, is the practice whereby the price of a product or service varies depending on the personal information of users that the supplier has obtained, i.e. charging different customers different prices for the same product.
The illustration compares the number of iOS and Android users in 2025 with the monthly users of ride-hailing applications[5] as of December 4, 2024, suggesting that 4% of the population is being discriminated against because of their choice to use an iPhone.[6]
How do pricing algorithms work?
The implementation of price discrimination relies heavily on extensive data collection. The pricing algorithms that determine these prices are ubiquitous in the online environment, where merchants can process unprecedented amounts of personal data and generate complex consumer profiles. Price discrimination means using information that reveals individual characteristics to set monetary costs, and algorithms provide the technological means to track and monitor individuals for that purpose. This is enabled by the large-scale collection, aggregation and analysis of personal data, in other words the use of ‘Big Data’.[7] Algorithmic pricing works as follows:
An online store identifies a customer, for instance, by means of a:
- Cookie: Digital tracking via ‘cookies’ and ‘digital breadcrumbs’ allows firms to analyze consumer behavior data, decipher personal characteristics and preferences, and implement (almost) perfect price discrimination by identifying a customer’s reservation price, i.e. the individual willingness to pay.[8]
- IP-address
- User log-in information
- Browsing behavior
- Order history
- Economic indicators
- Device type and services used
- Demographic and location data
This unique identification is generally not an end in itself; it allows the retailer either to infer the consumer’s willingness to pay for a certain product or to distinguish between broader categories, e.g. high- and low-spending consumers.[9]
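The segmentation step above can be illustrated with a minimal sketch. Everything here is hypothetical: the signals mirror those listed (device type, order history), but the thresholds, segment names and the +10%/−5% price spread are invented for the example and do not reflect any real platform's logic.

```python
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    device_type: str        # e.g. "ios" or "android"
    past_orders: int        # order history
    avg_order_value: float  # proxy for willingness to pay (assumed signal)

def infer_segment(profile: CustomerProfile) -> str:
    """Bucket the customer into a broad spending segment (thresholds assumed)."""
    if profile.avg_order_value > 2000 or profile.device_type == "ios":
        return "high_spender"
    return "low_spender"

def personalized_price(list_price: float, profile: CustomerProfile) -> float:
    """Adjust the list price by segment; the adjustment spread is illustrative."""
    adjustment = {"high_spender": 1.10, "low_spender": 0.95}
    return round(list_price * adjustment[infer_segment(profile)], 2)

rider = CustomerProfile(device_type="ios", past_orders=12, avg_order_value=2500.0)
print(personalized_price(500.0, rider))  # high-spender segment pays a markup
```

The sketch makes the privacy stakes concrete: the price a customer sees is a direct function of profile data collected about them, which is precisely why the disclosure and consent obligations discussed below attach to such systems.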
A famous example of personalized pricing is the case in which the ACM (Dutch Authority for Consumers and Markets) took action against Wish, an online e-commerce platform, for deceptive pricing practices. The company used personalized pricing, which varied based on factors such as consumer behavior and location, without properly disclosing the practice. Businesses may use personalized pricing only under strict transparency conditions. The ACM confronted Wish with its practices and, in collaboration with the European Commission and other European consumer authorities, monitored the situation to ensure that Wish upheld its commitments.[10]
Ethical concerns
Price discrimination raises significant data privacy concerns, especially when it involves collecting and analyzing personal information. Some of the key concerns are:
Collection of personal data: To implement price discrimination, businesses often gather detailed data on consumers, including demographics, purchase history, browsing history and location. This raises the question of how much personal information is being collected and whether consumers are fully aware of what is being gathered.
Informed consent: Many companies do not adequately inform consumers about how their data will be used in setting prices. Without clear consent, consumers may unknowingly agree to share sensitive information, leading to privacy violations.
Targeting vulnerable groups: Price discrimination based on personal data can disproportionately affect vulnerable populations.
Data security risks: The large amounts of sensitive data involved in price discrimination, if not properly secured, can become a target for cyberattacks. Leakage of personal data, such as the exposure of financial information or consumer habits, can have devastating consequences for users, including identity theft.
Transparency and fairness: Price discrimination can be perceived as unfair, particularly when consumers do not know why they are being charged a higher price. The absence of transparency in how data is used to set prices can undermine trust in businesses and fairness in the market.
Impact of behavioral tracking: Advanced tracking technologies, such as cookies and tracking pixels, allow businesses to monitor individual consumer behavior across websites. While this enables more personalized pricing, it also raises privacy concerns, since users are rarely aware of the extent to which their online activities are followed and utilized.
Surveillance capitalism: The practice of private companies tracking individuals’ online behavior to collect their data for profit. Usually hidden from the public and operating in a legal gray area, it is a further ethical concern, helping companies build detailed consumer profiles that extend far beyond reasonable expectations.[11]
Data privacy and Competition Law
India’s ride-hailing market is highly competitive, with local players and global giants holding significant market shares. The vast amounts of user data collected by these platforms can be seen as an asset, but they also raise concerns about potential anti-competitive practices. In Re: Updated Terms of Service and Privacy Policy for WhatsApp Users[12], the Competition Commission of India (CCI) noted that WhatsApp’s “take-it-or-leave-it” approach to data collection, which forced users to accept expanded data-sharing terms, was an abuse of its dominant position.
Platforms relying heavily on data can gain significant market power, creating entry barriers, excluding competitors or engaging in discriminatory practices, thus harming competition and consumer welfare. The data-driven practices of a dominant firm could trigger privacy issues under data protection law and also constitute an abuse of market dominance under competition law.
The WhatsApp case highlighted concerns about how other companies might similarly exploit their dominant market positions. Ride-hailing platforms have been accused of infringing on user privacy by tracking personal data such as battery percentage.[13] This practice not only compromises user autonomy but also indirectly contributes to data profiling, as companies collect and analyse personal information to influence behaviour. Furthermore, users may feel coerced into booking rides due to the limitations imposed by these platforms, where options become less accessible or more expensive unless users comply with the company’s practices. Such behavior can be seen as manipulation of consumer choice, as users are subtly nudged to book rides even if they do not want to.
Data protection laws and pricing algorithms
General Data Protection Regulation (GDPR), European Union:
Article 15(1)(h) imposes an information obligation: an access request in the context of a pricing algorithm must be accompanied by information about how the algorithm works and its possible consequences for the data subject. The right of access is intended to allow data subjects to verify the fairness and transparency of the processing of data relating to them.
Further, Article 22 of the GDPR addresses automated decision-making and human oversight, obliging the controller to handle personal data in a manner that takes account of the potential risks to the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons.
California Consumer Privacy Act (CCPA): Section 1798.100 of CCPA provides for general duties of businesses that collect Personal Information, it grants consumers the right to request information about the data that business collects about them, including the purpose for which the data is being used.
Lei Geral de Proteção de Dados (LGPD), Brazil: Article 20 provides a right to review decisions based on automated processing of personal data. It grants the data subject the right to request review of decisions made solely on the basis of automated processing that affect his or her interests, including decisions defining his or her personal, professional, consumer or credit profile, or aspects of his or her personality.
Consumer Protection Act, 2019: In India, the Central Consumer Protection Authority (CCPA) is in place to combat unfair trade practices, including differential pricing, and to enforce penalties on violators.
Digital Personal Data Protection Act, 2023: The law grants the data principal the right to access information from the data fiduciary about what data is collected and the purpose of its collection, ensuring personal data is handled with respect and transparency.
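The access rights described above (GDPR Article 15, LGPD Article 20, DPDP) all presuppose that the business keeps a purpose-limited record of what it collects. A minimal sketch of such a transparency register follows; the field names and purposes are invented for illustration and do not represent any statutory schema.

```python
# Hypothetical register a data fiduciary might maintain so that a data
# principal's access request can be answered with "what" and "why".
collection_register = {
    "location": "ride matching and fare estimation",
    "device_type": "app compatibility checks",
    "order_history": "service personalization",
}

def access_request() -> list[str]:
    """Return a human-readable summary of each field collected and its purpose."""
    return [f"{field}: collected for {purpose}"
            for field, purpose in collection_register.items()]

for entry in access_request():
    print(entry)
```

A register like this also makes the laws' transparency tests auditable: if a field (say, battery percentage) appears in pricing logic but not in the register, the mismatch is immediately visible.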
Conclusion
As digital marketplaces continue to evolve, the challenge lies in fostering innovation while protecting consumers’ interests. The CCPA’s intervention marks a significant step toward establishing clearer guidelines for digital pricing strategies and addressing algorithmic pricing, much as the European Union has done under the GDPR. Further, to balance business objectives with ethical standards, companies must prioritize data privacy, ensuring consumers are fully informed and their information is safeguarded.
Abhishekta Sharma, Junior Associate Advocate at S.S. Rana & Co., has assisted in the research of this article.
[2] https://www.zoho.com/the-long-game/personalization-vs-privacy
[3] https://www.business.com/articles/what-is-dynamic-pricing-and-how-does-it-affect-ecommerce/
[6] https://www.bankmycell.com/blog/smartphone-market-share-in-india/
[7] https://ejlt.org/index.php/ejlt/article/download/631/853?inline=1#_edn13
[8] https://link.springer.com/article/10.1007/s10551-019-04371-w
[9] https://www.europarl.europa.eu/RegData/etudes/STUD/2022/734008/IPOL_STU(2022)734008_EN.pdf
[11] https://www.britannica.com/topic/surveillance-capitalism