Search Elite in London – Takeaways from the Experts

This past Wednesday, I had the honor of being the MC of Search Elite, the conference run by GoEvents with Jackie Bissell, Andy Brown and Craig Rayner. It was the second Search Elite in London and the third one since it began last year.

We had 9 fantastic speakers and I wrote a recap of some of their key tips which you can implement straight away.

Search Elite Team

 

Digital Knowledge Management – The Key To Success

Duane Forrester kicked off Search Elite by talking about the progress of voice search. Duane said that there are three major breakout groups using voice search. But between this year and next, households with an income of over 150k USD will not be buying voice devices – because they already have. The groups below this income level, however, will be buying (and doubling their purchases), as the cost of these devices has dropped considerably since they were launched.

Duane said that Alexa supports 22 categories and has integrations with auto manufacturers like Ford. These devices are becoming more and more tied into our everyday lives.

Duane talked about the two "brains" – quick answers vs thoughtful dialogue – as you can see below:

Duane Forrester

Duane talked about the 5 tribes of machine learning:

  1. The Symbolists
  2. The Connectionists
  3. The Evolutionaries
  4. The Bayesians
  5. The Analogizers

Businesses are embracing machine learning because they have no choice. There are those deploying structured data and those who are not. Google rewards those who use it.

There are more interaction points than ever now, thanks to new consumer services. You need to make sure you are connecting with people on the right social platforms.

Why is this important?

Go to Amazon's jobs page and search for knowledge engineers – the jobs are there. Amazon is investing a lot into Alexa and it will continue to grow.

Alexa is not always listening to you; she is waiting for a specific pattern, for example "Hey Alexa", "Alexa" or "Amazon". And if Alexa is not connected to the cloud, she cannot respond.

Duane then ran through some of the data and content strategies in the photo below:

Duane Forrester Content

What does it mean to optimize your brand for voice search?

It can help you rank higher in the SERPs. So what should you do to prepare your content?

  • Adopt a long-tail, conversational-phrase approach when producing content.
  • An approach targeting the featured snippet (position 0) works here.
  • Content should read like a conversation.
  • Content should be substantial and thoughtful – thin content is a dead end. You have to make sure you go deep.
  • Build out the answers to common and uncommon questions.
  • Think of the customer as they try to solve their problem.
  • Look for opportunities to be more useful and helpful than your competitors.
  • Structured data – it is important.
  • Prepare technically – be secure.
  • If people have a negative experience on mobile, they are 62% less likely to buy.
  • Be trustworthy and authoritative.
  • Develop actions and skills for your brand.
  • Build a broad voice footprint for your brand – consumers are rapidly moving in this direction.
  • Personas – one of the most critical areas of investment in voice.

An Introduction to HTTP/2 for SEOs

Tom Anthony from Distilled said that latency is what drove Google to develop HTTP/2.

Why does HTTP/2 matter? It is an easy win for SEOs – a simple way to get a big performance boost.

HTTP/1.1 is the current situation – it is the protocol, the language your browser uses to connect to the site, as Tom explains below:

Tom Anthony

Tom explained that TCP is like the road we drive on, with requests travelling along it like trucks. With HTTP, anyone can go and look at the truck and see the information it carries – and we do not want people to see that information. With HTTPS there is now a tunnel, so no one can see the data. In other words, we have changed the TCP layer but not the HTTP layer.

Tom explaining latency

Tom says there are some issues with this:

  1. Small requests/responses still take time.
  2. Every page is made of many, many files (meaning many requests)
  3. Mobile connections increase latency

So we need to do something about latency.

Tom asked the audience, "What does a single request look like?" – and then showed us. Sending many single requests means some requests have to wait.

This is why people made sprite sets – bundling many images into one file means fewer requests. Another thing people do is use CDNs such as Cloudflare. CDNs have servers all over the world, so you can get the data quicker: 5 or 6 milliseconds instead of 100.

This latency motivated Google to create the SPDY protocol, which has now become HTTP/2.

  • HTTP/2 allows multiplexing: many requests per connection.
  • HTTP/2 requests are still the same – we have changed the traffic management system.
  • Requests look the same and the responses look the same; they are just quicker.
  • HTTP/2 allows server push. With server push, we are trying to reduce that last bit of latency.
  • Be careful with server push: there is no good mechanism for the server to know whether the browser already has a resource cached.
  • HTTP/2 requires HTTPS.

Your devs don't need to do anything – your server does the work. HTTP/2 does not change the content of the trucks, so developers don't need to worry about it; it is a change to your server configuration.

CDNs can do it for you – Cloudflare can keep the images for you, as Tom showed us:

CDNs explained with Tom

Does Google notice? Googlebot does not use HTTP/2 for crawling, so you cannot expect to see a change in Search Console.

HTTP/2 takeaways

  • It can be a quick performance win.
  • CDNs can make deployment easy.
  • HTTP/2 requires HTTPS.
  • If you are doing a site audit and want to see whether HTTP/2 is in place, enable the protocol column (in Chrome DevTools' Network tab, for example) – or check programmatically, as sketched below.
  • SPDY was the precursor to HTTP/2.
  • HTTP/1.1 and HTTP/2 can coexist.
  • Moving to HTTP/2 is not a migration (unlike HTTPS).
  • Googlebot won't benefit because it does not crawl over HTTP/2 – but your users will.
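
If you want to check programmatically which of your URLs are already served over HTTP/2, here is a minimal sketch using Python's httpx library (the URLs are placeholders – swap in pages from your own crawl list):

```python
# pip install "httpx[http2]"
import httpx

# Placeholder URLs - replace with pages from your own site.
urls = ["https://www.example.com/", "https://www.example.org/"]

with httpx.Client(http2=True) as client:  # offer h2 during negotiation
    for url in urls:
        response = client.get(url)
        # http_version is "HTTP/2" if the server negotiated it,
        # otherwise "HTTP/1.1"
        print(f"{url} -> {response.http_version}")
```

Cloudflare and most other CDNs negotiate this for you automatically, which is why Tom calls it a quick win.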

Visual Search: Connecting the Digital to the Physical

Jes Scholz said everything has changed due to the smartphone. Humans have always been visual creatures, and now smartphones are too. We are seeing a renaissance of visual marketing: before, it could not drive conversions, but now it can do both branding and conversions – if you use the technology.

The vast majority of people are on their mobile phones. If you are trying to understand consumer behavior from a laptop, that will not help – use your phone.

Image recognition has changed our online user behavior.

  • We are familiar with text-based search, but how can we accurately describe in words the things that we see?
  • Google's answer was to create visual search, with image-based queries.
  • The image itself is the search query, so we can now search for pictures with pictures.
  • Visual search is big in fashion, and this ability affects other industries like travel and food too.

Timeline

  • April 2013 – Google launched related images, but Google didn't yet have a good understanding of the images themselves.
  • Dec 2016 – product details on images launched: Google said that if you mark up your images with schema.org tags, they will show the details.
  • April 2017 – Similar items launched: you can shop the look of celebrity images.
  • April 2017 – Style ideas launched: you can see how to style the product with other clothes.
  • Aug 2017 – image badges launched: if there is, say, a recipe behind an image, a badge tells you so.
  • March 2018 – image captions arrived, giving users more info about the page behind the image so it can earn more clicks.

When you combine all of this, it is clear Google plans to turn image search into product discovery. They are now challenging platforms like Pinterest.

18% of searches return image blocks. This is really easy for brands to leverage.

Do two things:

  • Use high-quality and original images.
  • Use structured data.

 

Jes with image stats

We are starting to see search by camera. As SEOs we used to wait for the search to occur and then compete; with search by camera you get the user at the point of desire and take them straight to the conversion.

Jes Search by Camera

What are the advantages of Search by Camera?

  • One picture is worth a thousand keywords: the user does not need to describe what they are searching for. It also erodes language barriers.
  • Search by camera has been popularized by Pinterest.
  • You can now take a photo of products you discover and get recommendations online.
  • Samsung has integrated search by camera into their assistant, Bixby (tied in with Pinterest) – the first time a social platform has come preinstalled on an assistant.
  • Click on the shopping icon and you are directed to Amazon; it can also find places nearby thanks to an integration with Foursquare.
  • KPIs: 48% more product views, and users 75% more likely to return. If you could deliver that overnight for an ecommerce site, you would be a hero.

Brands are starting to sit up and take notice of search by camera. It is a more effective way to transfer the physical into the digital.

So how do you do this?

  • Simple formula: devs plus beer plus a one-day hackathon, to develop a Minimum Viable Product.
  • You can build this in one day using open-source technology – there is lots of image recognition technology out there.
  • Leverage pre-trained image classifiers or train your own.
  • Google Cloud Platform, for example (see the sketch after this list).
  • Use cases for computer vision – what should we be building?
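
To make the pre-trained-classifier route concrete, here is a hedged sketch using Google's Cloud Vision API. The image file name is a placeholder, and you would still need to match the returned labels against your own product catalogue:

```python
# pip install google-cloud-vision  (requires Google Cloud credentials)
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder photo - e.g. a snap of an outfit from a user.
with open("outfit.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Ask the pre-trained model for labels describing the image.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```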

Jes gave us three ideas:

  1. Snap to shop – take a picture of someone's outfit and the computer will find you something similar to buy. Jes also talked about putting your products on billboards, e.g. the Expedia photo.
  2. Snap to sell – if you are not a shop, use snap to sell to increase the number of listings on your website. A marketplace portal needs certain information from the user; computer vision can pre-populate those attributes, so the only things you need to ask for are the details unique to the listing, e.g. how much they want to sell it for.
  3. Snap for info – you can use this for all sorts of information, like Shazam for a song. For local SEO: if you run a restaurant, a poster ad could trigger Google Maps directions and take the customer on a journey.

The core idea behind all this is that the physical world can not only be digitised but also monetised by pointing your camera phone at it. Is your brand integrated with this platform?

Augmented Reality

What if we want to bring the digital into the physical? We can do this with our smartphone cameras. Augmented reality is the blending of virtual objects into the user's view of the real world.

  • For effective augmented reality marketing, you need to create meaningful, branded interactions.
  • AR is not a glorified QR code.
  • AR should give you a reason to look through your phone to an enhanced world beyond.
  • Branded filters focus on eyeballs, but we want to produce a good ROI.
  • With AR you can do a virtual try-on of that lipstick or those glasses, based on your own dimensions. The personalisation is perfect – the customer is the model – reducing doubt about how the product will look before they buy.

Give life to storytelling

  • AR provides the technical infrastructure to interact with the characters of the story.
  • Publishers are experimenting with augmented reality for this.
  • Two out of three people who are presented with the possibility of seeing something in AR will do so.
  • The Walking Dead brought the show to customers (e.g. using an advertisement at a bus stop).
  • What drives customer conversions is utility.
  • Think in visuals and not in keywords.

How to use voice devices within your marketing strategy

Nick Wilsdon started off his presentation by saying there has been a huge rise in voice devices over the past couple of years.

What are people doing on voice devices? Many people are asking questions with their devices. Nick presented some interesting stats, as can be seen in the photo below.

Nick Wilsdon

 

Voice device skills are on the rise – the skills are the code that runs on these devices. Nick then went through the anatomy of an Alexa phrase, looking at the starting phrases, e.g. begin, launch, load, open, play, resume.

These are the core launch phrases of Alexa voice devices, and a lot of them trigger the same actions.

The current vocabulary of starting phrases for voice search is limited, with "open" and "start" among the most used.

Skill Invocation Name

  • Must be easy to remember.
  • Cannot be changed once the skill is certified and published.
  • Alexa requires brand ownership documentation for single terms, but allows more than one skill to share an invocation name.
  • Google only allows unique term registration.
  • So register your brand term now – if you do not, someone else might.

Utterances

  • Utterances are simple, natural-language phrases.
  • Use your data to optimise these queries: e.g. Google's keyword tool, internal search data, SEO tool data, data scraped into a Google Sheet.
  • Customer support FAQs are another good source.
  • Pay close attention to your comments on the Amazon platform – early adopters are trying to help make skills better.

What makes a good Alexa strategy?

  • Google uses the featured snippet as its voice search answer.
  • Align your voice content with your featured snippets (your web strategy).

Aim to reduce friction and increase convenience

  • "Alexa, ask Vodafone how much is my bill?"
  • Interact with voice platforms and virtual assistants to provide the info your customers want (see the sketch below).
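
To make the "ask Vodafone how much is my bill" pattern concrete, here is a minimal, hypothetical sketch of an Alexa skill backend as an AWS Lambda handler. The intent name and the billing lookup are invented for illustration; a real skill would call your billing API after account linking:

```python
def lookup_bill_balance(user_id: str) -> str:
    # Hypothetical stand-in for a call to your real billing API.
    return "42 pounds and 50 pence"

def lambda_handler(event, context):
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "BillBalanceIntent"):
        user_id = event["session"]["user"]["userId"]
        speech = f"Your current bill is {lookup_bill_balance(user_id)}."
    else:
        # Fallback for launch requests and unrecognised intents.
        speech = "You can ask me how much your bill is."
    # Standard Alexa JSON response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```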

What is the key then to a good voice device strategy?

Make it an auxiliary channel. It should support activity you are doing elsewhere.

A successful voice strategy complements your other platforms – skills are not apps.

Take inspiration from Ocado and Just Eat (you can add items to your shopping list verbally). Just Eat offers friction-free ordering. Food delivery brands will likely be the first to explore push notifications.

How does this translate into our marketing strategy?

  • How it should not be used: as a replacement for a mobile app.
  • It should be used for first-contact ordering and to respond to enquiries.

How to use Voice Search

  • Re-ordering and end-of-product-life marketing.
  • Data recall for users ("What's my bill balance?"). Free trials.
  • Podcasts – 96% of the UK listens to radio stations, but only 6% to podcasts.
  • We need to delight our customers.
  • This is a huge opportunity for brands.
  • The Alexa skills marketplace is full of non-brand content.
  • Brands already have the services and APIs needed for this marketplace.
  • Only 15% of FTSE 100 companies have an Alexa skill (Vodafone study, May 2018).

Next steps

 

  • Buy a voice device so you can test.
  • Focus on featured snippets and schema for voice search.
  • Encourage developers to launch a basic skill.
  • Register brand terms across voice platforms.
  • Develop an API-driven approach to your web data and services.
  • Think about how you can complement your services and reduce friction for customers.

 

AI vs Human Translation in Localisation

Gianna Brachetti-Truskawa gave a detailed presentation about why it is worth investing time in translation.

Gianna knows companies that used automatic translation, and it did not work.

There are three main cases where you need translation:

  • Anyone doing a lot of business across new markets.
  • High-turnover content, like news aggregators.
  • Sites with a lot of user-generated content (UGC), like Allrecipes.

What is translation?

  • Translation is the communication of the meaning of a source-language text by reproducing that meaning in an equivalent target-language text.
  • It is about the meaning, not the words.
  • This is where automatic translation went wrong.
  • Localisation is similar, but you also take into account everything surrounding the translation.

AI vs Human Translation

  • Language builds trust. If you want a brand to be sticky, make sure you translate it well.
  • Users need to feel familiar with your text.
  • There are untranslatable words, too.
  • To show where the differences lie, Gianna compared the human brain with automatic translation.

E.g. in the UK you write "Dear Sir/Madam", but in French you just write "Madame".


It's all about decisions. A human proofing translations knows when to omit something or add text – they know the set of conventions.

The left-hand side of the brain is good for language – Broca's and Wernicke's areas.

Concepts and connotations matter – e.g. whether a noun is grammatically masculine or feminine. In Spanish, "bridge" is masculine, so speakers describe a bridge with more masculine words.

E.g. "You've got nice pants" – in the UK, the term "pants" means underwear.

A translator's brain plays with all these concepts, and you cannot get that with AI.

Possible issues with using AI

Gianna

What options do you have?

  1. Automated translation – but don't do this unsupervised.
  2. Purely human translation – but it is expensive and takes a lot of time and process set-up.
  3. Natural language generation (NLG) based on structured product information – the AI needs to be trained, the product data has to be very structured, and it is not available in many languages.
  4. Human text creation – SEO keyword lists cannot simply be translated, the process is time-consuming to set up and the briefing is demanding.
  5. A pragmatic mix of automated and human translation.

Do not forget the most important elements, like the shopping cart area and the terms and conditions:

  1. Identify the most important areas of the site, e.g. terms and conditions.
  2. Identify the most important conversion pages.
  3. Use human translation for points 1 and 2, and AI-assisted text creation for the rest.
  4. Iteratively identify and then human-translate the next most important content.

Understand the math

  • A standard page is 1,800 characters – about 250 words.
  • Assume a translator translates one page per hour.
  • Add time for communication: 10–20% on top.
  • Prices range from 9p to 30p per word.
  • That means 250 words per page at 14p per word works out at around 308 GBP per day (see the worked example below).
  • The investment is a decision.
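
Here is that math as a small script, using the talk's assumptions (14p per word, eight pages in a working day, 10% communication overhead):

```python
WORDS_PER_PAGE = 250    # a standard 1,800-character page
PRICE_PER_WORD = 0.14   # GBP; the quoted range is 0.09-0.30
PAGES_PER_DAY = 8       # one page per hour over a working day
COMM_OVERHEAD = 0.10    # 10-20% extra for briefing and queries

cost_per_page = WORDS_PER_PAGE * PRICE_PER_WORD
cost_per_day = cost_per_page * PAGES_PER_DAY * (1 + COMM_OVERHEAD)

print(f"Per page: GBP {cost_per_page:.2f}")  # GBP 35.00
print(f"Per day:  GBP {cost_per_day:.2f}")   # GBP 308.00
```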

Treat your storefront with care.

Raising JavaScript: Getting Ready for the Real World of SEO

Bartosz Goralewicz gave us a great presentation about JavaScript, going into detail about one site, Hulu.com, and how they did not fix the basic issues with their website. When you turn off JavaScript you cannot see any of their content – a really bad user experience.

What if you have a framework website?

  • Angular.io – Angular is created by Google, yet Google cannot fully see sites made with it.
  • angular.io/guide/security (three levels deep) – Google cannot see these URLs; it can only see one folder down.
  • Top publishers have the same problems.
  • Look at USA Today – the page is not mobile friendly, and they have not moved to mobile-first.
  • How about websites that specialise in pre-rendering?
  • Prerender.io is only partially indexed in Google: only their H1 and H2 are indexed, not their content. Their job is to pre-render for Google, so you would expect them to do it well.
  • Delivery.com – you cannot find their content in Google, only the home page, and they are not mobile friendly.
  • Run the mobile-friendly test – Google provides this test for you to try.
  • The world's biggest platform, Aliexpress.com, lost a huge amount of visibility when moving to JavaScript.
  • Alibaba.com also saw reduced visibility.
  • Indexing JavaScript is a problem in many niches.

So how is Google handling client-side JavaScript?

Google Flights – even content from Google's own product (Google Flights) is not indexed in Google.

The real cost of JavaScript

  • This is something most people do not understand.
  • HTML is the fully baked cake; JS has to be processed along the way, and that takes a lot of power.
  • HTML is like the Toyota Prius; JavaScript is the Hummer.
  • You would hope that websites ship less JS to mobile, but that is not the case – the average amount of JS is almost 400KB.
  • USA Today case study: in Chrome DevTools, the page takes 11 seconds to load and maxes out the CPU.
  • But the EU version is much smaller.
  • JS needs a lot of computing power.
  • Accuweather.com takes 6 seconds to first meaningful paint.
  • On a Motorola G4, it takes 19 seconds to load.
  • The iPhone X CPU is more powerful than a basic MacBook's.

Why is JavaScript so expensive?

  • JS frameworks have multiplied over the past few years – even Basecamp created their own framework.
  • The most popular are Angular, Vue and React (backed by Facebook).
  • There are other popular ones, too.

JS frameworks for SEO

  • There are dozens of different JS frameworks.
  • Lots of things affect indexing, too.
  • Depending on very small differences, Google acts differently.

A Look Behind the Curtain

When you publish a plain HTML page, it gets indexed. But with JS there is a second wave of indexing, as Bartosz explained below:

Bartosz

A quick guide to JS rendering.

  • There is client-side rendering and there is also server-side rendering.
  • For server-side rendering you need to pick the right framework.
  • There is also hybrid rendering, as well as dynamic rendering, as Bartosz showed us below:

Dynamic Rendering

Google says you can use tools like Puppeteer for this. But is pre-rendering a silver bullet?

There are issues with pre-rendering/dynamic rendering:

  • Computing power – it needs a lot of servers.
  • Prone to issues (often load related).
  • Downtime = lost rankings.
  • Requires lots of SEO knowledge.
  • Requires a great dev team to make it run smoothly.
  • More complex and difficult from an SEO perspective (the crawler needs two sets of code).

How about JS hacks?

  • Look at Myntra.
  • Switch off JS and the content is gone.
  • Look at their rankings – they rank for everything in India.
  • But if you disable JS, you see lots of junk content and links at the bottom.

Troubleshooting JS indexing

  • Google Search Console – use Fetch as Google (fetch and render).
  • Run the mobile-friendly test.
  • See how your website compares to what you see in Chrome.
  • Use Diffchecker to see how Google has interpreted your code (a minimal version is sketched below).
  • What is partial indexing? You can have a website that is only partially indexed. There are two waves: the first indexes the HTML content, the second the JS-rendered content.
  • Google uses Chrome 41 for rendering, which is a very old version – the latest is 68.
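
You can reproduce the Diffchecker step locally with Python's built-in difflib – compare the raw HTML your server sends with the rendered DOM (e.g. copied out of the mobile-friendly test or DevTools). The file names here are placeholders:

```python
import difflib

# Placeholder files: save your raw page source and the rendered DOM first.
with open("raw_source.html") as f:
    raw = f.read().splitlines()
with open("rendered_dom.html") as f:
    rendered = f.read().splitlines()

# Print a unified diff - anything only in rendered_dom.html is content
# that exists solely after JavaScript runs.
for line in difflib.unified_diff(raw, rendered,
                                 fromfile="raw_source.html",
                                 tofile="rendered_dom.html",
                                 lineterm=""):
    print(line)
```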

Homework

  • Use Diffchecker.
  • Crawling – look at your server logs.
  • Compare your site with Chrome 41.
  • Make sure your content is crawled.
  • Excited about JS? Go to ele.ph
  • Join a Node.js developers group.
  • Right now, Google isn't perfect with client-rendered single-page apps.
  • They should come back and crawl it again.
  • But they won't crawl the JS as efficiently as HTML pages.

Advanced auditing for SEO – and the actions to take based on those findings

Bastian Grimm took us through some of the tools he uses to audit sites. He said it is very important to audit your sites regularly to check the following:

  • Redirects – identify any kind of wrong redirect, e.g. 302/304/307.
  • Crawl errors – too many 4xx responses are a sign of poor site health.
  • Googlebot can't log in (403 Forbidden).
  • 5xx server errors: usually infrastructure related – watch closely and/or talk to IT.
  • Understand your top/worst crawled URLs and folders – highly crawled pages/folders could, for example, be used for additional internal linking (add link hubs); low-crawled areas need to be linked more prominently.
  • Understand whether (new) URLs have been crawled at all – if relevant URLs haven't been discovered/crawled, your internal linking is probably too weak. Consider XML sitemaps, better/more prominent linking, etc.
  • Crawl waste – this depends on the size of the domain; small domains probably don't need to worry about it.
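
Dedicated log tools do this at scale (see Bastian's list further down), but a bare-bones version of the "top crawled URLs and status codes" check fits in a few lines of Python. The log path is a placeholder, the regex assumes a standard Apache/nginx combined log format, and in a real audit you should verify Googlebot via reverse DNS rather than trusting the user agent string alone:

```python
import re
from collections import Counter

# Matches the request and status code in a combined-format access log line.
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<url>\S+) HTTP/[^"]*" (?P<status>\d{3})')

urls, statuses = Counter(), Counter()
with open("access.log") as log:  # placeholder path
    for line in log:
        if "Googlebot" not in line:  # naive filter; verify via reverse DNS
            continue
        match = LOG_LINE.search(line)
        if match:
            urls[match.group("url")] += 1
            statuses[match.group("status")] += 1

print("Status codes:", statuses.most_common())
print("Top crawled URLs:")
for url, hits in urls.most_common(10):
    print(f"{hits:6d}  {url}")
```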

Bastian talking

 

There are many areas of the site you should be aware of, and these will come out in the crawl:

  • URL parameters cause the most problems – a tool like Loggly is super powerful here, using regular expressions.
  • URL parameter behaviour changes over time.
  • Monitor less relevant file types, especially for parameters, and consolidate/remove them whenever possible.
  • Find noindexed but frequently crawled pages – they've been actively set to noindex, so you want them to be crawled rarely.
  • Start with a crawl-source gap analysis – e.g. using DeepCrawl – understanding the differences between sources can be a great help.
  • Drill down/filter for URL-level info – e.g. a product URL that was crawled frequently but has accidentally been made non-indexable; overlay crawl directives against actual crawl behaviour.
  • Find pages that haven't been crawled but exist in your XML sitemaps.
  • Or indexable pages that you don't actually want indexed.

Combining Data

  • Crawl frequency by content type over time.
  • Audit errors by different user agents.
  • Identify long-uncrawled URLs – start improving those pages that are actually important.
  • Identify unknown large pages/files.
  • Involved in migration work? Set up live checks for errors.
  • Monitor performance post-migration.
  • Use different time frames to see changes even more easily.
  • Combine with external link data: link hubs vs crawl.
  • Log file auditing is not a project; it is a process.

Bastian showed us some log file management tools:

Log File Tools

 

Structured Data Explained

Fili Wiese talked about structured data in great detail. Fili said that with structured data there is no guarantee Google will use it, and no guaranteed ranking boost, but there is a good chance of a higher CTR. It helps Google understand the website and its content, and rank it better accordingly.

We used to have data-vocabulary.org, and most of the schemas were based on that; now it is schema.org.

What are the schemas we should implement?

  • Local business schema, contact details.
  • Structured data should be implemented wherever applicable.
  • Articles, for example – call an article an article.
  • Events, recipes, job postings, videos, fact check, top places, music.
  • Some of these are not fully embraced yet and are still being developed.
  • Carousel (previously known as rich cards).
  • What can we ignore? WebPage – no one uses this except WordPress.
  • Be aware of the data-vocabulary breadcrumbs markup: there was a bug in it, but they have worked to fix it.

How to implement?

  • Google Tag Manager (GTM) can inject structured data into the website, but it is not a long-term solution – on any given crawl, Google may not run the JavaScript and so not count it.
  • Tag Manager is good for testing, but not a long-term solution. The same goes for the Data Highlighter (and its markup does not exist outside of Google anyway).
  • Put the schema inline in the code, not in GTM.
  • Be aware that changing one line in the HTML could then lead to errors.
  • JSON-LD – put the schema in JSON-LD within the HTML.
  • Google does not want you to abuse your position in the SERPs – there is a Google Search Quality team that makes sure people do not violate the webmaster guidelines.
  • You need to have something unique to sell, keep information updated and not hide the marked-up data – it has to be a true representation of the main content.

It is very easy to mess this up. Do not block the relevant resources in robots.txt.

  • Do not forget the recommended fields.
  • What is only recommended today may be required in the future.
  • Do not forget the required fields.

There is a Structured Data Testing Tool from Google, which can validate your markup and give previews.

Conflicting signals are one of the areas where we as SEOs come in:

  • The WebSite schema should be implemented only once, on the home page. None of the tools tell you this – they only say whether the markup is correct or not.
  • Google also advises against mixing microdata and JSON-LD types on a single page.
  • We also want to make sure that we do not mismatch with our sitemaps.
  • We need to make sure there is mobile/desktop parity.
  • If you do things wrong, you can get a penalty.

E.g. if you have a job posting and you do not remove the markup when the job has been filled.
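
For illustration, here is a hedged sketch of the kind of JSON-LD block Fili recommends putting inline in the HTML rather than injecting via GTM. The job details are invented, and the validThrough field is what lets you expire the posting cleanly (generated with Python here just to show the output format):

```python
import json

# Hypothetical job posting - swap in your real details.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Technical SEO Manager",
    "datePosted": "2018-09-01",
    "validThrough": "2018-10-31",  # expire the markup once the job is filled
    "hiringOrganization": {"@type": "Organization", "name": "Example Ltd"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "London"},
    },
}

# Print the script tag to paste inline into the page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(job_posting, indent=2))
print("</script>")
```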

Beyond Web Pages

  • Structured data goes beyond web pages.
  • Speakable – schema has been extended to voice with this property.
  • One of the other places it appears is email.

Takeaways

  • Help Google better understand your content
  • Use JSON-LD
  • Audit your site regularly (at least once a year)

The Giant Speed Benchmark: Your Emotional Support Dog

Nichola Stott started the presentation with the finding that the stress caused by mobile delays is comparable to watching a horror movie.

For every one-second improvement in site speed, Walmart saw a 2% increase in conversion rate.

How did we do it? What did we learn from the Giant Speed Benchmark and how can we use it?

  • It started last year as an R&D project: Erudite made their own website as a Progressive Web App (PWA).
  • From there you can send push notifications.
  • It needs an app manifest.
  • And a service worker registration (a man in the middle).
  • It must satisfy the speed criteria (the most fundamental being a sub-10-second time to first interactive).
  • And use HTTPS.
  • So Erudite audited the top 100 websites to see how many PWAs there were.
  • They audited using Lighthouse.

(the five Lighthouse PWA audit criteria)

Nichola Stott PWA

  • They did all of this manually – it took some time, but they went through 1,000 websites.
  • Then they found out that HTTP Archive has Lighthouse results for all of this.
  • Now Nichola had over 5,000 websites with 7 distinct, comparable performance data points.
  • HTTP Archive – they are killing it, pulling in some great data sources. Three days ago, Angular took over xx as the JS framework of choice.
  • WebPageTest is a dev-facing presentation of the data.
  • Lighthouse isn't like that – it sits between marketer and developer, and it looks different. It is something you can give to a client and they will not cry.
  • It is also very user-facing: the metrics matter in terms of what customers want to do.

So what did Erudite do?

  1. Took the UK top 5,000 URLs and cleaned the list up.
  2. Classified them all into categories, e.g. health, retail, travel and leisure.

What did we learn?

  • Nichola collected and analysed the data in Nov 2017. On the first metric, "other" and "news and media" were the fastest categories, at less than 2 seconds.
  • The most popular consumer websites offer the slowest mobile experiences.
  • Retail channel metrics:
  • renderStart is FVP (first visual paint); FCP is first contentful paint.
  • FMP is first meaningful paint – when the more meaningful content is visible.
  • CI is consistently interactive (see the metrics photo).

For FCP, Nichola looked at the median and the top and bottom quartiles.

FCP is the persuasion factor. At the median, most of the sites are under 3 seconds. Then comes meaningful paint, when most of the page has colour on it.

Consistently interactive means the user can actually do things. It took nearly 10 seconds for sites like Next – quite slow, and not a good experience.

Fully loaded

This includes things like ad-tracking scripts or the cookie drop – the bottom quartile took over 166 seconds to fully load.

Retail – Speed index

For the 1,200 retail sites, the median speed index was 4.8 seconds.

How can we use it?

  • Set goals that are SMART.
  • E.g. ensure that 3G consistently interactive is below 7 seconds, audited via Lighthouse, by the end of Q1.
  • Get budget and talent – if the site is very slow, the benchmark data helps you make the case for more SEO resource.
  • Evidence competitor performance.
  • Resource and project prioritisation.
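
If you want to track the same Lighthouse metrics Nichola benchmarked without running audits by hand, one option is the public PageSpeed Insights v5 API, which wraps Lighthouse. A hedged sketch (the URL is a placeholder, and audit IDs vary between Lighthouse versions – "interactive" is the modern name for time/consistently interactive):

```python
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

# The response embeds a full Lighthouse report under "lighthouseResult".
audits = requests.get(PSI, params=params).json()["lighthouseResult"]["audits"]

for metric in ("first-contentful-paint", "speed-index", "interactive"):
    print(f"{metric}: {audits[metric]['displayValue']}")
```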

What a fantastic day at Search Elite! Following on from the nine talks, we had a panel led by Jono Alderson with Gerry White, Tim Stewart, Russell McAthy and Bart Schutz.
