HT Score Scheduler Frequency Increased Through the End of the Awards

As we round the corner into the last month of the HotelTechAwards, we have increased the HT Score scheduler to run twice a day (it typically runs only once a day).

This will give participants more visibility into real-time rankings day-to-day without having to wait a full 24 hours to see updated scoring.


NOTE ABOUT SUB-CATEGORY RANKINGS: Please note that, given the amount of dynamic data, rich filters, content and multiple languages, the category pages are extremely dense/heavy pages.  As such, page content is cached weekly in order to improve page load speed for users.

Verified Case Studies Variable

As with all new variables, verified case studies will start out with a nominal impact on the HT Score in their first year after rollout and will increase in importance over time to ensure that all vendors have time to prepare for the update.

How Do Case Studies Help Buyers?

Verified case studies are incredibly important to HTR's mission of delivering an unmatched hotel technology search and selection experience for hoteliers around the world. They are also the first step towards increasing benefits-based selling throughout the site: by seeing real examples of their peers using digital products and the outcomes those peers are achieving, hoteliers can visualize and imagine the possibilities, instilling urgency, trust and FOMO to drive adoption of tech in the global hotel industry.


Can Case Studies Determine the Leader in a Category?

While case studies are a nominal part of the overall HT Score (<3%), they can be the determining factor in a close race where two companies have maxed out most other variables, similar to how this can happen with Google's search algorithm.  Here is an example of how this can happen:

Example Scenario and Analogy of Google Page Speed Algorithm Update

  • Imagine two websites with strong and mostly similar domain authority, each with an excellent blog post about 'gardening tulips'
  • Imagine blog post A consistently ranks ahead of blog post B but has really poor page speed and user experience metrics.
  • When Google rolls out the user experience update, post B could move into the lead despite post A ranking better across other metrics.

If you notice your rank change in a category where you have close competitors in the rankings, this can easily be resolved by making sure that you have at least some verified case studies published for buyers.


New Sub-Category Pages

As of this week we're rolling out redesigned sub-category pages.  These pages will serve as hubs for relevant content about each sub-category, with new filters to help users move beyond a single top solution recommendation and instead identify the solution that is right for them.


Key changes: 

  1. 🎯 Segment based recommendations & buyer personas
  2. ⚙️ New data filters to give users the power to find the right fit
  3. 📖 Unified content hub of helpful resources


🎯 Segment Based Recommendations & Buyer Personas

There is no best overall vendor in any given category, nor has there ever been, just as there is no single best hotel asset.  But to make comparisons amongst hotels, asset managers and buyers use metrics like RevPAR, and similarly HTR uses the HT Score.

The reason for this is that popularity is not a strong indicator of fit (or it would just be a bigger-is-better competition).  Having said that, popularity does tend to be a strong indicator of fit when applying segmentation (e.g. most popular for resorts, most popular in Europe, etc.).  For example, a mobile ordering solution that is recommended by lots of resorts is more likely to be a good fit for a resort looking for mobile ordering than a vendor who is not recommended by other resorts.

For this reason we have created new buyer personas that outline hotel types that tend to have unique attributes which lead them to make similar tech purchasing decisions.

At a glance, users can now quickly see that there is no one-size-fits-all solution but rather different tools that serve different types of properties with unique needs.


⚙️ New data filters to give users the power to find the right fit

The new data filters enable HTR to move beyond a single overall ranking and instead allow users to quickly refine the rankings based on their own criteria, including hotel size, city, region, hotel type or even PMS.
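For illustration, here is a minimal sketch of how segment-based filtering of a ranked vendor list might work. The data model, field names and example vendors below are hypothetical, not HTR's actual implementation.

```python
# Hypothetical sketch of segment-based filtering; the data model, field names
# and example vendors are illustrative, not HTR's actual implementation.
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    ht_score: float
    regions: set           # e.g. {"Europe", "North America"}
    hotel_types: set       # e.g. {"Resort", "City Center"}
    pms_integrations: set  # e.g. {"PMS X", "PMS Y"}

def refine_rankings(vendors, region=None, hotel_type=None, pms=None):
    """Return vendors matching the buyer's own criteria, ordered by HT Score."""
    matches = [
        v for v in vendors
        if (region is None or region in v.regions)
        and (hotel_type is None or hotel_type in v.hotel_types)
        and (pms is None or pms in v.pms_integrations)
    ]
    return sorted(matches, key=lambda v: v.ht_score, reverse=True)

# Example: a European resort running "PMS X" refining the category ranking
vendors = [
    Vendor("Vendor A", 92.1, {"Europe"}, {"Resort"}, {"PMS X"}),
    Vendor("Vendor B", 95.4, {"North America"}, {"City Center"}, {"PMS Y"}),
]
print(refine_rankings(vendors, region="Europe", hotel_type="Resort", pms="PMS X"))
```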


📖 Unified Content Hub of Helpful Resources

The new category pages are jam-packed with helpful resources for buyers, giving them tons of free tools, resources and content all related to the category they are researching, including:

  • Recommended articles
  • Free report downloads
  • Category buyers guide
  • In-depth deep dive (overview, benefits, features, pricing, implementation and FAQs)
  • Related categories

Poseidon Algorithm Final Stage

Per the October '21 announcement, the Share of Voice variable went live in vendor dashboards but was not to impact scoring until after the 2022 HotelTechAwards winners were announced to ensure that (a) scoring didn't change during the competition and (b) all vendors had several months to prepare for the update to pre-empt any potential negative impact to rankings.  

Later this month the Share of Voice variable will be given weight in the algorithm, along with increased weight for review consistency, and will therefore begin impacting rankings.

To check your Share of Voice or Review Consistency head to the Reputation Report Card.

Want to learn more about the update? Learn more

(CORRECTION) Awards Year in Review Email

Earlier this morning HTR sent out the annual year in review notification email to all HotelTechAwards participants recapping your annual campaigns.

In order to send this email HTR exports data from the frozen database as of the 12/15 5pm awards deadline and maps the 27 merge fields and data points via Zapier to populate and send the automated notifications.    

Shortly after sending the notification, we identified that one of the 27 fields was not mapped correctly and was fetching from the incorrect column, the 'final score points value' (shown below).

Please note that this is the only one of the 27 variables that was not mapped correctly, and it in no way impacts your ranking in the awards.  The data to audit in the email is all of the inputs to the raw points score, which were all correct (e.g. reviews, integrations, partner recommendations, employee survey completions, etc.). That said, if you would like to receive your corrected/true final points score from the frozen database, please reach out via the live chat on site.

We apologize for any confusion this may have caused and please feel free to reach out with any questions you may have.



Review Incentive Policy Community Memo & Updates

UPDATE (3/15/22): Please note that after surveying the community for feedback, pay-per-review will continue not to be permitted on HTR (details below)

TL;DR

  • Several pay-per-review incident reports.  We have become aware that several vendors have been offering pay-per-review without knowing that this is against HTR’s incentive guidelines
  • This is against HTR’s guidelines and the rules of the awards.  While this is a Level I infraction (the least severe), it is still against HTR’s policies and therefore not permitted, and it is in violation of the rules of the 2022 HotelTechAwards
  • Penalties for the current HotelTechAwards competition. Vendors caught in breach of this policy will have the reviews from the trailing 30 days unpublished, and those reviews will not be counted towards the HotelTechAwards competition as it is against the rules
  • Several vendors have claimed not to be aware of this aspect of the incentive policy.  Several vendors have pointed out that the confusion lies in the fact that other review sites do permit this type of pay-per-review incentive
  • Implications for incentives on the community.  While these reviews are against the rules of the awards competition and HTR’s incentive guidelines, HTR will amend its policy effective January 15, 2022 to permit pay-per-review with fair disclosure in accordance with FTC guidelines.  Incentivized reviews will be clearly designated in the UI and ratings will not factor into overall averages.

HTR Vendor Community,

It has come to our attention that a handful of vendors were unaware of HTR’s policy not to allow directly incentivized reviews on Hotel Tech Report, so we wanted to reach out to all vendors and clarify HTR’s policies about incentivized reviews to make sure there is no confusion.

Overview of Current Policies & Perspectives on Incentivized Reviews

Since inception, HTR’s policies around incentivized reviews have remained the same and can be found in more detail via the help center.  In summary, HTR’s perspective and stance on incentivized reviews to date has been:

  1. Reviewing B2B software is not the same as reviewing consumer goods.  When a guest is reviewing their hotel on TripAdvisor, for example, there are intrinsic motivations like scrapbooking to catalogue their journeys, building online clout/credibility or sharing an experience with friends.  Reviewing B2B software is quite different in nature and carries lower intrinsic motivation, which is why incentives are a helpful tool to gather more feedback and foster knowledge sharing.  Incentivizing a review of B2B software is similar to paying for a survey completion: compensating a busy individual for their professional time without biasing the content of the feedback.
  2. Incentives as a way to thank customers are encouraged.  Incentives are common in the B2B review space.  Companies are asking their users to take time out of a busy work day to help with their sales and marketing by advocating for their product. As such, offering an incentive is not only acceptable, we actually encourage it.  So much so that we even subsidize incentives for Premium Members and have an incentive matching program for basic members.
  3. Outreach may never be phrased in a way that introduces bias.  While incentives are encouraged, there are obviously guidelines to follow to ensure that the way the incentive is phrased to the user does not introduce bias into their review.  Specifically, to date HTR has outlined several permitted and not-permitted scenarios, which vary in level of severity; examples can be found below:


Scenario | Explicit bias | Implicit bias | Policy | Infraction Level
1) Randomized giveaway | No | No | ✅ Allowed | None
2) Limited giveaway | No | No | ✅ Allowed | None
3) Pay-per-review | No | Yes | ❌ Not allowed | Level I
4) Pay-per-POSITIVE review | Yes | Yes | ❌ Not allowed | Level II


Vendor Infractions & Penalties for Pay-Per-Review

There have been three separate occurrences of vendors in the community being reported for pay-per-review (scenario 3), which is a Level I infraction but nonetheless an infraction and against the rules of the HotelTechAwards.  Since the inception of the annual HotelTechAwards four years ago, the rules have explicitly stated that direct pay-per-review is not permitted.  The reported incidents were Level I infractions (scenario 3) and were not found to introduce explicit bias.  Given that there have been several occurrences, all claiming that the nuances of the permitted incentives were not clear, the vendors identified will not be immediately disqualified from the competition; however, they will incur the following penalties:

  1. Vendors will have 72 hours to respond to the violation report and show, with screenshot evidence, that they have corrected their outreach messaging
  2. Reviews collected within 30 days of the correction will be unpublished and will not count towards the annual HotelTechAwards competition.
  3. Vendors who have been reported for breaching this guideline will be marked under close watch for up to 90 days, during which their account will be monitored closely and their reviews will appear in a separate moderation queue where moderators will keep a closer eye on activity.
  4. Vendors in breach will only be permitted to collect reviews via the Review Manager in the dashboard for 45 days, so that HTR can more closely monitor their activity and outreach strategy
  5. Vendors will receive a strike, meaning that any future incidents will incur more severe, immediate Level II penalties without the option to cure


Future Amendments to HTR’s Incentive Policies

UPDATE (3/15/22): After surveying the community (both hoteliers and vendors), we have concluded that HTR will not be changing its review guidelines and will continue not to permit pay-per-review.  While this is allowed on some review websites with disclosure, ultimately, as a verticalized portal for the hotel industry, it is more critical that HTR's rankings and ratings maintain accuracy and, more importantly, eliminate bias to ensure a fair platform that upholds the highest standards of integrity.  Additionally, pay-per-review not only skews/biases consumer ratings, it also gives larger companies with more resources an unfair advantage.  As a result of these findings and feedback from the community, pay-per-review will continue not to be permitted on HTR under any circumstances, as it always has been historically.

Vendors have shared three main reasons for the confusion around review compensation rules on the site: (a) HTR allows indirect incentives/giveaways, (b) pay-per-review is permitted by other review websites, and (c) pay-per-review is common in the B2B reviews space.

Despite these points, HTR’s current (and historical) policies don’t permit pay-per-review (only randomized giveaways).  After discussing with hoteliers and vendors, we found a consensus belief that pay-per-review should be permitted if certain conditions are met to disclose incentives to website visitors and hotel tech buyers.

The general governing body for online reviews and issues around consumer affairs is the FTC, which states the following:

  • “Knowing that reviewers got the product they reviewed for free would probably affect the weight your customers give to the reviews”
  • “reviewers given free products might give the products higher ratings on a scale like the number of stars than reviewers who bought the products.”
  • “If you've given these customers a reason to expect a benefit from providing their thoughts about your product, you should disclose that fact in your ads.”
  • “If you’re offering them something of value in return for these reviews, tell them in advance that they should disclose what they received from you.”

In light of community feedback and FTC guidelines, HTR will amend its policies after the current year's awards competition to allow for direct incentives when there is fair disclosure (effective January 15th, 2022).  Disclosures will be incorporated into the hotel tech buyer user experience on the Hotel Tech Report platform.  So how will this work?

  • The 'write a review' form will include a field for hoteliers to indicate whether direct compensation has been offered for their product feedback
  • Vendors will have a toggle in their dashboard indicating when they are running a pay-per-review campaign (on or off platform)
  • Vendors found to be running direct incentive campaigns without using the toggle will have all reviews within 6 months of the incident report immediately marked as ‘incentivized’
  • Directly incentivized reviews will NOT count towards the quantitative ratings averages.  Their volumes will count towards the average review thresholds and geographic targets; however, ratings like customer support, ease of use, etc. will not be included in overall ratings in order to ensure apples-to-apples comparisons amongst all vendors for buyers without the possibility of bias (see the illustrative sketch below)
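To make that last point concrete, here is a minimal, hypothetical sketch of how an average could exclude directly incentivized reviews while volume still counts them. The review records and field names are illustrative only, not HTR's actual schema.

```python
# Hypothetical sketch: directly incentivized reviews count toward review volume
# but are excluded from the quantitative ratings average. Field names and the
# sample records are illustrative, not HTR's actual schema.
reviews = [
    {"overall_rating": 9.0,  "incentivized": False},
    {"overall_rating": 8.5,  "incentivized": False},
    {"overall_rating": 10.0, "incentivized": True},   # direct incentive disclosed
]

review_volume = len(reviews)  # all three count toward volume thresholds
organic_ratings = [r["overall_rating"] for r in reviews if not r["incentivized"]]
overall_average = sum(organic_ratings) / len(organic_ratings) if organic_ratings else None

print(review_volume)    # 3
print(overall_average)  # 8.75 (the incentivized 10.0 is excluded)
```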

Conclusion

Ultimately, incentives are a positive and encouraged mechanism to thank customers for their time, effort and expertise, but all endorsements must reflect the honest opinions or experiences of the endorser (Source: FTC Endorsement Guides).

Therefore while HTR may (or may not) in the future expand its incentive guidelines to permit direct incentive reviews, these reviews would need to carry a clear designation to consumers about the nature of the incentive.  Additionally, companies who have been in breach of this rule to date will receive penalties commensurate with their breach to ensure that (a) the behavior is immediately corrected and (b) reviews collected in this manner will not be counted towards the 2022 HotelTechAwards competition.

Country Code Remapping Patch

HTR recently became aware of an edge case related to the mapping of country codes to regions that was introduced along with the deployment of the Poseidon Algorithm update.  We are working on a patch that will be deployed in the next couple of days (expected to go live the week of 11/22).

  • Impact: Minor.  Impacted only select countries and select reviews.  Some companies may have experienced slightly deflated region and country bonuses if they had reviews from select countries that had mapping issues when the user selected their location via the Google Places API.
  • Issue: Edge case identified where certain country codes were not mapped properly to their region and/or country as it pertains to the calculation in the HT Score algorithm related to global reach bonuses.



What happened?

HTR typically pulls country data from the popular Google Places API (excluding mainland China reviews); however, a select group of countries and regions were found to be “unmatched” in the lookup function, causing some vendors to experience artificially deflated country counts in their HT Score indexes.

Which countries were identified to have potential mapping issues?

Below is a list of countries that had anomalous entries in the database and were updated in the lookup function.  One such example is that 517 reviews in the database were classified as originating in ‘Czechia’ while the HTScore algorithm lookup function was searching for the term “Czech Republic” as defined by the Google Places API.
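As a rough illustration of the kind of fix involved, here is a minimal sketch of normalizing raw country strings to the canonical names a region lookup expects. The alias table, region table and function name are hypothetical, not HTR's actual lookup code.

```python
# Hypothetical sketch of normalizing raw country strings (as returned by the
# Google Places API) to the canonical names a region lookup expects. The alias
# and region tables below are illustrative, not HTR's actual mapping.
COUNTRY_ALIASES = {
    "Czechia": "Czech Republic",
    "United States": "United States of America",
}

COUNTRY_TO_REGION = {
    "Czech Republic": "Europe",
    "United States of America": "North America",
}

def region_for(country_name):
    """Resolve a raw country string to its region, falling back through aliases."""
    canonical = COUNTRY_ALIASES.get(country_name, country_name)
    return COUNTRY_TO_REGION.get(canonical)  # None means "unmatched"

print(region_for("Czechia"))   # "Europe" once the alias is in place
print(region_for("Atlantis"))  # None -> would deflate the country count
```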

Please note that only a fraction of cases in these countries were impacted by these mapping issues, so a country being listed does not mean that all reviews from that country were mapped incorrectly.

List of impacted regions:

  • 'Virgin Islands'
  •  'United States of America'
  •  'United Kingdom'
  •  'Turks and Caicos Islands'
  •  'Tanzania'
  •  'Spain'
  •  'Seychelles'
  •  'Saint Martin (French part)'
  •  'Saint Barthélemy'
  •  'Russian Federation'
  •  'Poland'
  •  'Norway'
  •  'Netherlands'
  •  'Netherlands Antilles'
  •  'Myanmar'
  •  'Morocco'
  •  'Maldives'
  •  'Macedonia'
  •  'Luxembourg'
  •  'Korea'
  •  'Italy'
  •  'Ireland'
  •  'Greece'
  •  'Germany'
  •  'Gambia'
  •  'France'
  •  'Finland'
  •  'Czech Republic'
  •  'Congo (Kinshasa)'
  •  'China'
  •  'Belgium'
  •  'Bahamas'
  •  'Australia'
  •  'Austria'
  •  'Switzerland'
  •  'Brunei Darussalam'
  •  'Hungary'
  •  'United Arab Emirates'
  •  'Estonia'
  •  'Fiji'
  •  'Indonesia'
  •  'Sweden'
  •  'Cyprus'

Who was impacted and what was the impact?

Most vendors will not notice any difference in their scores or the scores of other products in their categories.  That said, vendors with reviews from the countries listed above MAY have been impacted; however, not necessarily, as these were anomalies and in many cases the review countries were properly classified.  Impacted vendors may have experienced an increase in countries and regions served, which would in many cases increase HT Scores from the understated scores previously visible on the platform.  These changes are extremely rare and likely unnoticeable, but some vendors may notice a several-point increase in scores as the algorithm begins detecting their accurate country and region counts once the mapping patch goes live.

Improving Relevancy of Multi-Product Reviews

Context: We have received several complaints from members of the community who believe that review syndication and link pre-filling are allowing certain companies to collect reviews for lots of their products, where in many cases hoteliers are selecting several products but in reality the review content only pertains to a single product.

The Problem: On the one hand, a hotelier who uses multiple products from a single vendor should be able to leave one review.  On the other hand, this makes reviews less relevant, and in some cases where more than 3 products are selected the user will only write relevant content for some of the products.

The Solution: Ultimately there are two sides to the argument and there is no right answer; rather, it’s about balancing a seamless and easy experience for hoteliers to review products while also implementing constraints to maintain content relevancy.  Therefore we have implemented two key changes:

  • Keyword tagging: The review form will now display keyword inspiration to users based on the categories they select and prompt users to use at least one keyword per product category, guiding the user to provide more relevant and helpful content.
  • 3-product maximum: While hoteliers can still review multiple products, they cannot review more than 3 products, keeping their review focused and ensuring that they include relevant feedback and content for each of the products they choose to review.

These changes serve to make reviews more reliable, accurate, relevant and helpful for hoteliers.
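For illustration, here is a minimal, hypothetical sketch of how these two constraints could be enforced when a review is submitted. The keyword lists, category names and function are illustrative, not HTR's actual validation logic.

```python
# Hypothetical sketch of the two review-form constraints: a 3-product maximum and
# at least one relevant keyword per selected product category. Keyword lists and
# field names are illustrative, not HTR's actual validation logic.
CATEGORY_KEYWORDS = {
    "Mobile Ordering": ["menu", "order", "delivery"],
    "Housekeeping": ["rooms", "schedule", "turnover"],
}

MAX_PRODUCTS = 3

def validate_review(selected_categories, review_text):
    """Return a list of validation errors (an empty list means the review passes)."""
    errors = []
    if len(selected_categories) > MAX_PRODUCTS:
        errors.append(f"You can review at most {MAX_PRODUCTS} products per review.")
    text = review_text.lower()
    for category in selected_categories:
        keywords = CATEGORY_KEYWORDS.get(category, [])
        if keywords and not any(keyword in text for keyword in keywords):
            errors.append(f"Please mention at least one {category} topic, e.g. {keywords}.")
    return errors

# A review covering one category with relevant content passes with no errors
print(validate_review(["Mobile Ordering"], "Setting up the menu and order flow was easy."))
```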


Phase III: Share of Voice Update is Live

Update: Share of Voice Implemented

Continued from Phase II: HT Score Poseidon Update is now live

The 3rd and final Phase of the Poseidon algorithm update is now live and can be seen in the Reputation Report Card.

Rollout notes: 

  1. Please note that more than 90% of companies on Hotel Tech Report will not be impacted by this update.  Only companies with lots of reviews from very few hotels are likely to notice any impact.
  2. Please note that while Share of Voice will now display in the Reputation Report Card for companies to see their current standings, it will not impact scoring for the 2022 HotelTechAwards; its weighting will be officially rolled out into the algorithm in February 2022 after winners are announced.


What is the Share of Voice client bias update?

We’ll use two hypothetical vendors to illustrate the importance of this algorithm patch: Vendor A and Vendor B. Vendor A has 30 reviews from 30 hotels, while Vendor B has 30 reviews from 1 hotel client.

In this example, you can see that while both vendors have the same # of reviews, this is not an apples-to-apples comparison when it comes to credibility and install base since Vendor A has 30 verified hotel clients and Vendor B has only 1. 

The Share of Voice variable will help identify potential client selection bias (aka. cherry picking) that can arise if a vendor is strategically only asking for feedback from a select number of clients versus reaching out to their full install base.
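Directionally, the variable looks at how a vendor's review volume is distributed across distinct properties. Here is a simplified, hypothetical sketch of such a client-diversity signal; the actual Share of Voice weighting and thresholds are not described here, and the field names are illustrative.

```python
# Simplified, hypothetical sketch of a client-diversity signal: the share of
# distinct reviewing properties relative to total review volume. Thresholds,
# weighting and field names are illustrative, not HTR's actual algorithm.
def client_diversity(reviews):
    """reviews: list of dicts with a 'hotel_id' identifying the reviewing property."""
    total = len(reviews)
    unique_hotels = len({r["hotel_id"] for r in reviews})
    return unique_hotels / total if total else 0.0

vendor_a = [{"hotel_id": i} for i in range(30)]   # 30 reviews from 30 hotels
vendor_b = [{"hotel_id": 1} for _ in range(30)]   # 30 reviews from 1 hotel

print(client_diversity(vendor_a))  # 1.0   -> broad, representative install base
print(client_diversity(vendor_b))  # ~0.03 -> flagged as potential cherry picking
```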

Who will be impacted by the client bias update?

Most companies will not be impacted by this update.  The only companies that will be impacted are those that have selectively sought feedback from lots of users at a small sample of client hotels.

*Note: You can still get a handful of reviews from a single property; however, if all of your reviews are coming from a small number of properties, the algorithm will identify this as potential cherry picking/gaming and your score will be impacted as a result.

What can we do to maximize our score for the client bias update?

Just make sure you are reaching out organically for feedback to your client base instead of cherry picking select clients.  Also, take note that while you may certainly collect a handful of reviews from different users at a given hotel, if your strategy has been to try to game the system and collect tons of reviews from a single hotel (or a small number of hotels), this will be identified by the algorithm update and your score will be impacted.  Hopefully it goes without saying, but make sure your review collection is organic, representative and diverse, and that you are not trying to game the system by gathering lots of reviews from only a few select clients.

