Justia columnist and U. Washington law professor Anita Ramasastry discusses two controversial online business practices: steering and differential pricing. Steering, which the travel site Orbitz has used, directs potential customers to options they may be likely to choose, based on other information the site knows about the customer, such as whether he or she is using a pricey Mac or a less expensive PC. Differential pricing, as its name suggests, occurs when different customers of the same website receive different prices for exactly the same item, due to information that the site knows about them. Ramasastry suggests that regulators should consider cracking down on both practices.
Just last week, The Wall Street Journal reported that the online travel-booking site Orbitz.com shows Mac computer users more expensive hotel recommendations than it shows PC users. Why? Apparently, marketing reports indicate that Mac users, who spend more on their computers, are also likely to pay up to 30% more for their hotel rooms. That fact isn’t surprising, as Macs have long been associated with a somewhat richer demographic than PCs have been. But the use of that information is somewhat novel. It is also controversial: the Journal article drew hundreds of reader comments. Some vowed never to use Orbitz again; others weren’t concerned by the practice.
In defense of its policy, Orbitz notes that it is not offering different prices to different customers for the same hotel room, but rather, it is only “steering” Mac users to rooms in more expensive (4- and 5-star) hotels. Mac users can still find cheaper hotel rooms, but they may need to click more, filter their searches differently, and so on, in order to find the same lower-priced hotel that a PC user would initially see.
We all know that offline, we may pay different prices for the same goods and services–for instance, when we buy a car. But that difference stems from in-person bargaining, not from an implicit assumption that is made based on an algorithm that draws upon another set of data about us.
In this column, I will examine the Orbitz controversy and its implications. Orbitz is not the first (and surely will not be the last) web merchant to steer customers to certain options. Moreover, other web merchants are, unlike Orbitz, actually offering customers different prices based on their search habits, customer profiles, and other data about them. This latter practice is arguably more problematic than simply changing the initial options that are shown, based on such data.
This type of price discrimination may be legal, but to many consumers, it feels wrong. In this column, I will discuss why consumers feel that way, and I will recommend some policy prescriptions that could (1) help maintain customer confidence in Internet retailers, while also (2) recognizing that customization appears to be inevitable, as more and more data about individual consumers becomes available to online and other businesses.
Orbitz’s Defense
As noted above, in response to the story in The Wall Street Journal, Orbitz defended its practice, saying that its software is simply showing users what it thinks they will want to see. Consider Orbitz CEO Barney Harford’s blog comments on the topic: “We’ve identified that Mac users are 40% more likely to book a four- or five-star hotel than PC users. A similar skew applies for iPad users.” Harford further explained, “On our website, once you get to the page for a particular property (let’s call it ‘Hotel A’) we show consumers a list of alternative hotel recommendations. This list is primarily made up of nearby properties that were ultimately booked by customers who had also viewed Hotel A. That’s a pretty useful feature already, but we’re then able to personalize that list by taking into account factors such as whether we see that the user is using a Mac or a PC.”
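Orbitz has not published the details of its recommendation engine, but the mechanics Harford describes can be sketched in a few lines of Python. Everything below, from the hotel fields to the 1.4 boost given to four- and five-star properties when a Mac user is detected, is a hypothetical illustration rather than Orbitz’s actual code; the point is simply that re-ranking on a device signal changes what appears on the first page of results without changing any room’s price.

```python
# Hypothetical sketch of "steering": the list is re-ranked, not re-priced.
# All names, numbers, and weights here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Hotel:
    name: str
    stars: int
    nightly_rate: float
    popularity: float  # e.g., share of "Hotel A" viewers who booked this property

def recommend(hotels, user_agent, page_size=25):
    """Return the first page of recommendations, nudged by device type."""
    is_mac = "Macintosh" in user_agent  # crude device signal from the browser's user-agent string

    def score(hotel):
        boost = 1.4 if (is_mac and hotel.stars >= 4) else 1.0  # assumed weighting
        return hotel.popularity * boost

    return sorted(hotels, key=score, reverse=True)[:page_size]

hotels = [
    Hotel("Budget Inn", 2, 89.0, 0.30),
    Hotel("Boutique South Beach", 5, 310.0, 0.25),
    Hotel("Midtown Suites", 3, 140.0, 0.28),
]

# Same hotels, same prices, but the Mac user sees the boutique property first.
print([h.name for h in recommend(hotels, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7)")])
print([h.name for h in recommend(hotels, "Mozilla/5.0 (Windows NT 6.1)")])
```

Run with those two user-agent strings, the same three hotels come back in a different order: the five-star property tops the Mac user’s list, while the budget property tops the PC user’s list, which is the kind of effect the Journal measured on real searches.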
Harford pointed out that Orbitz also provides families with kid-friendly recommendations. For example, if a user searches for Orlando hotels with kids as part of the reservation, then kid-friendly hotels will come up first. Orbitz users planning, say, a Disney World trip might find such suggestions helpful.
Finally, Harford emphasizes that on Orbitz, prices for any given hotel room are the same for all customers. But he also admits, “We know that 90 percent of customers book a hotel from the results they see on the first page of results (typically the first 25 hotels). Furthermore we know that 50 percent of customers book one of the top five properties we show, and a remarkable 25% book the top sorted property.”
The Wall Street Journal found that Orbitz searches in a number of major cities showed the same hotel recommendations for Macs and PCs. But Orbitz searches in other cities, such as Miami Beach, yielded pricier boutique hotels for Mac users on the first page of results, whereas different, cheaper hotels were shown on the first page for PC users. And, “Overall, hotels on the first page of the Mac search were about 11% more expensive than they were on the PC,” the newspaper pointed out.
Price Customization: How Does It Work? And, Is It Legal?
Price customization and dynamic pricing (that is, offering different prices to consumers based on their own profiles, search habits, and demographic data) are a growing trend. Large companies (e.g., United Airlines) are using such software. But most of these companies are silent about their practices, as they do not want to publicly highlight that they give different customers different options.
Why are companies investing in this type of software? The answer is simple: to increase profits. According to experts, allocating discounts with price-customization software typically yields two-to-four times as much revenue as offering the same discounts at random.
And just how can a software algorithm decide what price to offer, or what recommendations to make? One way is to monitor how quickly shoppers click through to an online retailer’s checkout or payment page: Those who get there more quickly, and are more ready to buy, need not be reeled in with a discount or lower price.
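The column does not describe any particular vendor’s system, so the following is a minimal, hypothetical sketch of the click-speed idea: a shopper who heads straight to checkout pays list price, while one who lingers or wanders is offered a discount. The two-minute threshold, the eight-page cutoff, and the 10% discount are invented for illustration.

```python
# Hypothetical sketch of click-speed-based price customization (not any real vendor's logic).

def quoted_price(list_price, seconds_to_checkout, pages_viewed):
    """Quote a price based on how hesitant the shopper appears to be.

    Assumption for illustration: shoppers who reach checkout quickly are
    ready to buy and get no discount; slow or wandering shoppers get a nudge.
    """
    hesitant = seconds_to_checkout > 120 or pages_viewed > 8
    discount = 0.10 if hesitant else 0.0
    return round(list_price * (1 - discount), 2)

# A shopper who races to the payment page is quoted full price...
print(quoted_price(200.00, seconds_to_checkout=45, pages_viewed=3))    # 200.0
# ...while one who dithers for five minutes is quoted 10% off.
print(quoted_price(200.00, seconds_to_checkout=300, pages_viewed=12))  # 180.0
```

The same scaffolding could just as easily key off a stored customer profile or a device signal instead of click speed; the shopper sees only the final number.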
Andrew Fano, a consultant in Accenture’s Chicago office, reported in The Economist that at least six of America’s 10 biggest Web retailers customize their prices in some way. He also noted that the practice is hard for shoppers to spot. Customers would have to spend a lot of time comparing their search results with those of friends and colleagues to uncover hidden customization.
Notably, price customization is not a new practice. In September 2000, Amazon.com outraged some customers when its price discrimination was revealed. The practice was made public when a buyer reportedly deleted the cookies on his computer that identified him as a regular Amazon customer, and then watched the price of a DVD he’d been offered drop from $26.24 to $22.74. Amazon said the difference was the result of a random price test, and offered to refund the difference to customers who had paid the higher prices.
Apparently, Amazon has experimented with such random price tests more than once: Also in 2000, consumers discovered that Amazon was using dynamic pricing, when shoppers comparing prices on a “bargain-hunter” website found that Amazon was randomly offering the Diamond Rio MP3 player for up to $51 less than its usual $233.95 price. The resulting backlash prompted Amazon to refund the difference to those who had paid more. Today, Amazon declines to discuss its pricing system.
Will Information Gleaned from Social Media Influence the Prices That Customers Pay in the Future?
Companies that use price-customization software have not yet used social-media profiles and posts in their pricing models, likely out of fear of consumer reaction to perceived privacy infringement. But in the future, that situation may change: Companies may check social media to tell whether we are sports fans, wine lovers, wealthy doctors, poor students, and so on, and use that information to price products and services accordingly.
According to news stories, call-center operators are beginning to scan Twitter for information on the shoppers to whom they are speaking. Sometimes, a shopper’s tweets may provide useful hints about whether he or she needs a discount before placing an order, or is wealthy enough to pay full price.
Why Does Orbitz’s Process of “Steering” Feel Morally Wrong to Many Consumers?
Brick-and-mortar retailers have also been known to keep track of in-store shopping habits. Some will even mine customers’ shopping histories to find out, say, when a customer is having a baby, so that they know exactly when to start bombarding her with relevant promotions. I recently wrote, here on Justia’s Verdict, about how Target keeps track of our spending habits in order to send us coupons that are relevant to our lives. Some may find getting relevant coupons convenient, but others worry about privacy, and about retailers’ creating and then using profiles of us.
Regarding the Orbitz situation, customers may feel that they are being lumped into a broad category (e.g., Mac users versus PC users) and, as a result, are not being treated fairly by the retailer.
After all, even if prices are the same, “steering” may ultimately constrain our choices. The first few hotels shown in a site’s search results are crucial: As noted above, according to Orbitz, a full half of its customers book one of the top five hotels shown on the screen, and 25% book the top hotel displayed.
Orbitz has a strong argument for using customer data: When certain customers have been shown to be willing to spend more on nicer hotels, Orbitz would be foolish not to present the most costly hotel options first to those customers. The argument is that Orbitz is simply showing these customers the options they are most likely ultimately to opt for and enjoy.
Are Expectations of Fairness Being Thwarted?
Still, even if “steering” is good business, it may evoke feelings of unfairness. When we search online, especially on sites that provide low-price guarantees or other promises, we expect to be treated equally and fairly. We also don’t typically expect to be sorted or singled out based on our choice of computer. Moreover, our choice of computer may not even reflect our own preferences, but rather those of our employer, if we happen to be booking travel while at work, or those of our spouse or friend, if we happen to borrow a computer.
Is what happens online different from what happens when we walk into a store? Again, some of the difference lies in our expectations. We know that when we walk into a certain kind of store, or onto a car lot, we will be able to bargain with a salesperson–and that the price we pay for many kinds of items (cars, used goods) may not be identical to what others pay.
eBay is another good example: We know that eBay pricing occurs in an auction setting, where auction results for identical items may vary based on who bids and how much the bidder wants the item. We also know that, on eBay, there are sometimes “Buy It Now” options, and that we can put in a “Best Offer.” But eBay is more transparent than other sites (such as Orbitz), as we can watch the bidding as it happens, and check out the prices that identical or similar items are commanding elsewhere on the site.
Price Differentials Based on Risk to the Seller May Seem Fairer Than Price Differentials Based on Other Factors
We also know that our credit scores and health histories have been used to create differential pricing for insurance premiums, loan interest rates, and more. But these types of pricing differentials are theoretically based on differential risk.
Over time, we have learned how our credit reports and medical histories are being used by insurers and lenders to charge us different premiums and interest rates. But even once we all knew some of what was going on, regulators and consumers had to press for greater transparency in the process.
Consumers now have better access to their credit reports, for example, and are better informed as to how their scores impact the credit terms that they are offered.
Is It Ever Against the Law to Provide Differential Customer Pricing?
Currently, price discrimination is legal except when it is based on race, sex, national origin, or religion. If a merchant offered higher prices to men or to women, for example, this would run afoul of anti-discrimination laws. But it is still legal to discriminate in price based on whether you drive a fancy car, or whether you own a Mac.
Still, lawsuits in this area might be possible if retailers (including e-retailers) were to run afoul of their own promises or guarantees, or to otherwise engage in unfair or deceptive trade practices.
Moreover, the way in which retailers use consumer data to make predictions and change prices should involve greater transparency. Consumers should know why retailers are charging different customers different prices. And they need to know, more fundamentally, that retailers are, in fact, charging customers different prices, or are making recommendations that are based on various criteria. Without that knowledge, customers may proceed with a transaction falsely believing that they are getting the best price available, when that is not really the case.
What Can Be Done about Price Customization?
Dynamic pricing on the Internet has been around for more than a decade. However, most consumers do not understand the extent to which different websites are implementing such customization. Amazon’s price-customization controversy occurred in 2000. Now, in 2012, we are learning of Orbitz’s current practices regarding what options are shown to consumers. What else happened in the interim? And what else is going on right now?
If consumers start to assume that all websites are engaging in such behavior (unless a company indicates otherwise), then they can start using self-help to improve their own chances of securing favorable prices. For example, consumers can spend longer searching a particular site, or can leave a site without purchasing anything and return later, to try to “game” the algorithms that watch for consumers who don’t buy quickly, or don’t buy in a given visit, and offer such consumers better prices.
Consumers can also sort search results by price, from lowest to highest, to make sure they have not been steered only to higher-priced options, as occurred with the Orbitz Mac users. And consumers can stop leaving as much data about themselves online, or even search using different computers. But all of this is like guerrilla warfare: using stealth techniques to try to circumvent a site’s software so that it will not steer you to expensive items or charge you higher prices.
Rather than being forced to spend our energy trying to circumvent software, it would be better for us to know up front about a site’s practices regarding which items are shown first, and, more importantly, regarding price customization.
Federal and/or state regulations could be promulgated that require sites to make at least minimal consumer-protection disclosures as to what type of information is used to price items or services, and whether pricing customization occurs on the site. This information can be directly relevant to consumers’ decisions as to whether to do business on a particular site.
As I have noted in prior columns, context is important: It is important for consumers to know not just what information is being gathered about them, but also how it is being used. The White House has recommended a new privacy bill of rights that would require retailers to take context into account when compiling and sharing personal information. I’d like to see that measure passed into law.
In the past, when we provided data to a site via our search terms, or via our choice of browser, we did not expect that sharing such information would lead to adverse consequences in the pricing of goods or services that we might want to purchase. Now, we need to take into account that possibility, and regulators need to look at price-discrimination practices to ensure that consumers are treated fairly.