
Search engine optimization with Bing SEO
Oct 16, 2019

Thorsten
Abrahamczik
Category:
SEO

As a website operator, you want your site to be easily found. You often think of Google, but not of the search engine Bing, right? For many website operators, search engine optimization is an abstract topic, and Bing SEO often gets pushed into the background. This is probably because Germany is a Google country. According to Statcounter, Google had a market share of 94.5% in Germany in 2019, while Bing only had 3%.

Fig. 1: Statcounter statistics on search engine usage in Germany in 2019

What features does Bing offer?
With Bing's preview function, previously unheard of in search engines, users receive useful information about content and internal links even before they enter the website. The displayed content is actually drawn from the page content and not, as with a snippet, from a meta description.

Still, you shouldn't ignore the Bing search engine. Users who come via Bing often show better user behavior than those coming from Google: they typically view more pages, stay longer on the domain, and have a higher conversion rate. It may therefore be worthwhile for you as a provider to engage this audience. In addition, the results of Apple's Siri service and Amazon's Alexa are based on Bing.

Fig. 2: Comparison of user behavior on Bing with Google

Commonalities between Bing and Google
In principle, you are already optimizing your website for Bing if you carry out search engine optimization according to Google's guidelines. This is partly because Bing uses similar criteria, and partly because some SEO standards from Bing, Google, and other search engines were created jointly. An example of this is the structured data from schema.org.

Further aids for Bing search engine optimization
A look at Bing's search results page is also very helpful, because Bing specifies much more precisely where its data comes from. In addition to pure search engine optimization, revising other profiles on the internet, such as Xing, Wikipedia, etc., can also lead to better visibility on Bing. Optimizing the multimedia elements on your website can likewise show a stronger effect than on Google, which is visible in Bing's image search and video results. By strategically using structured data and consistently maintaining alt attributes, you can position images and videos very well in the search engine (a small markup sketch follows at the end of this section). Furthermore, Bing search offers some options that are not available in Google. For example, individual elements in images can be selected for search. Videos are supplemented in the results with additional information, such as whether a video is a trailer. Videos can also be watched in a preview, and Bing considers the search query history to display the most suitable results.

Fig. 3: Identification of elements in Bing image search

What to pay attention to
Microsoft offers various guidelines and tools for its search engine to support users in achieving the best possible search engine optimization. The following three are the most relevant from our perspective:

Bing Webmaster Tools
Submit sitemap.xml
Block URLs
Monitor reports
Control crawls
Test mobile-friendliness

We warmly recommend using the Bing Webmaster Tools to gain better control options.
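Picking up on the structured data and alt attributes mentioned in the multimedia section above, here is a minimal, hypothetical sketch of what such markup can look like. All file names, URLs, and texts are invented placeholders, and the exact properties should be checked against the schema.org documentation before use.

```html
<!-- Sketch only: a descriptively named image with an alt attribute,
     plus schema.org markup for an accompanying product video.
     All URLs and texts are invented placeholders. -->
<img src="/images/red-cotton-t-shirt.jpg"
     alt="Red cotton t-shirt, front view">

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Red cotton t-shirt – product video",
  "description": "Short clip showing the fit and fabric of the red cotton t-shirt.",
  "thumbnailUrl": "https://www.example.com/images/t-shirt-video-thumb.jpg",
  "uploadDate": "2019-10-01",
  "contentUrl": "https://www.example.com/videos/t-shirt.mp4"
}
</script>
```

Markup of this kind gives Bing (and Google) explicit information about the image and video content instead of leaving the interpretation entirely to the crawler.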
Of course, the domain can also rank in the Bing search engine even if it has not been registered there. Nonetheless, it is interesting to see how Bing assesses the website compared to Google. Bing Webmaster Guidelines Bing offers the very informative Bing Webmaster Guidelines on its websites. These provide users with initial help on the topic of search engine optimization. The instructions are thematically subdivided and clearly presented. They are helpful for both experienced users and newcomers to the topic of Bing SEO who want to know what the search engine operator values. Markup Validator If you use structured data on your website, you can check the correct implementation with the Bing Markup Validator . Supported formats are: HTML Microdata Microformats RDFa schema.org OpenGraph from Facebook Please note, however, that you must register to use the tool with the Bing Webmaster Tools. This also applies to the SEO Analyzer and the keyword research tool. Peculiarities of the Bing search engine We generally want to give you three tips that will help you with Bing SEO: 1. Exact Match URLs: Bing places much more emphasis on the correct name in the domain than Google does. Therefore, if you own a domain that includes your brand name, it will automatically rank better on Bing. 2. JavaScript Currently, Bing is not as advanced as Google in accounting for JavaScript on websites (as of 10.2019). This means that while your site may rank well on Google, it might be placed lower on Bing. Sites that dynamically load content through JavaScript are particularly affected. However, if your site uses minimal JavaScript or does not load content dynamically, you can overlook this point. 3. Google Adjustments Google frequently makes adjustments to its SEO guidelines, suddenly weighting certain features more or less. The last known adjustment was the removal of "noindex" from the robots.txt file. It's important to note with these changes that they apply only to Google and not automatically to Bing. So don't be alarmed by them. Conclusion The Bing search engine offers many unique features to differentiate itself from its bigger rival, Google. At the same time, its users exhibit consistently better behavior on websites, regardless of the industry. To rank well here, you only need to make a few specific adjustments, but you can use Bing's excellent tools to closely monitor your developments. How can we support you? Do you want to increase your conversions and revenue through organic search? Are you already ranking well on Google and now want to kick off with Bing? We support you in Bing SEO and thus achieve higher rankings in the Bing search engine. Contact us here , we look forward to your inquiry!
On-page SEO Strategies for Better Search Engine Optimization
Jun 18, 2019

Thorsten
Abrahamczik
Category:
SEO

SEO is a commonly used term among online marketing managers and often comes in conjunction with off-page optimization . Both terms describe the search engine optimization of your own website . The aim of targeted SEO optimization is to improve the ranking of your own website in search engines, generate more traffic, and ultimately lead to more conversions or leads. However, marketing managers usually only know a portion of the actions that can be performed. In this article, we explain what on-page SEO is all about and offer initial tips for practical implementation . On the internet, you will come across various spellings, but it always means the same thing. Possible terms are: Onpage SEO, Onpage optimization, On-page optimization, On Page optimization, SEO on-page, SEO optimization, or even on-site optimization. On-page Analysis – The first step in On-page SEO At the beginning of your work, you should perform an onsite analysis of the entire website. First, you analyze the technical SEO to identify possible technical pitfalls that could affect the crawling of the website. In the next step, you examine the contents of the website in order to carry out content optimization. After investigating both areas, you can derive a prioritized action plan that you implement step by step. To track the development of your website in SEO, you can use Google's Google Search Console. This serves as an interface between website operators and Google, showing which URLs rank well, for which keywords the domain can be found, and where there are problems on the website. It also visually represents how often individual pages are displayed on Google's search result pages. Technical SEO – The Foundation of On-page Optimization Before you can start optimizing your content, your technical SEO must be implemented flawlessly. This ensures that the website can be crawled easily by search engines and that all content is read and processed accurately . If technical SEO is not implemented correctly, it may happen that content optimizations, the so-called content optimization, do not take effect properly because the search engines do not have access to the content. Therefore, technical SEO is the foundation of every SEO on-page optimization. Crawling – Can the search engine reach your website? Crawling solely refers to accessing content and has nothing to do with indexing. Search engines like Google certainly crawl pages, but they do not index all of them. This can occur for various reasons, including poor search engine optimization. To improve the relevance of content, only the content that needs to be processed by search engines should be made crawlable. Pages that are not important, such as search results pages or the imprint, should be excluded from crawling. This is referred to as optimizing the so-called crawl budget. It determines how many pages of a website can be crawled and is set individually by search engines for each domain. In this context, there is an important distinction: Crawling is not equivalent to indexing! To control crawling, use the meta tag "robots" and the "robots.txt" file . However, it should be noted that the instructions contained in the "robots.txt" are merely a suggestion to search engines and can be ignored entirely. Furthermore, search engines can also reach excluded pages through other ways, such as backlinks. Therefore, using the "Disallow" function is not a reliable method to exclude pages from crawling. 
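To make the distinction between crawling and indexing more tangible, here is a minimal, hypothetical sketch of a page that may be crawled but should not be indexed. The page and its content are invented for illustration; a robots.txt "Disallow" rule, by contrast, only asks crawlers to stay away and, as described above, can be ignored.

```html
<!-- Sketch only: an unimportant page (e.g. internal search results) that
     crawlers may visit but search engines should not index. -->
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Internal search results</title>
    <!-- keep this page out of the index, but let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">
  </head>
  <body>
    <!-- page content -->
  </body>
</html>
```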
On the contrary, incorrect use of the robots.txt can lead to major problems on search results pages, as Figure 1 illustrates.

Figure 1: Websites indexed by Google for which no descriptions can be displayed because they are excluded from crawling in the robots.txt.

In May 2019, Google updated its in-house crawler to the current Chromium version 74. This is an important note because its predecessor was outdated and supported only a few modern web technologies. The new crawler can now recognize modern SEO optimization and perform better on-page analyses.

URL Structure, Internal Linking, and sitemap.xml are important SEO On-page Factors
Next, you need to review the URL structure of your website. Key questions are: Is it readable and clearly understandable for humans? Can users tell where they are on the website? Is it not too long? The URL structure should not exceed 5 hierarchy levels. This ties in with the structure of the website. Today's on-page SEO is not about optimization for search engines, but for users. This means that content must be easily and quickly accessible. Excessive cascading of individual webpages is therefore not advisable, as it only leads to more hierarchies.

John Mueller, Webmaster Trends Analyst at Google, announced on 03/05/2019 via a Webmaster Hangout that the internal linking of pages should be weighted even more strongly than the URL structure. Visible URLs are primarily relevant for the user experience. Internal linking of pages should instead be done in a way that is relevant to the topic in order to achieve good on-page optimization. In this context, this is also referred to as siloing. For good internal linking, you must ensure that links are made only within a topic (silo). For example, all articles on web analytics should link to each other, but not to articles on SEO. Therefore, you should always develop a linking concept and check, especially with links that have JavaScript functionality, whether crawlers can find and follow the links.

If you have similar content, you should use the canonical tag (a minimal sketch of the tag follows at the end of this section). Imagine you run an online shop and offer a t-shirt in five colors. You provide a separate page with a unique URL for each of these color variants. In this case, you would have duplicate content on the domain because all pages would be textually identical and would only differ through the color specification. With the canonical tag, you can indicate on all pages which of the five URLs is actually relevant to search engines and which serve solely as added value for the user. Additionally, you should provide search engines with a sitemap.xml file, a file listing all the URLs that are to be indexed. This file should therefore not include URLs set to "noindex" or those excluded from crawling or indexing in any other way.

Avoid Duplicate Content with SEO On-page
The topic of duplicate content is one of the main focuses of SEO on-page optimization. Often, the same pages are accessible under multiple URLs. Classic examples are:
URL with http and https
URL with www and without www
URL with trailing slash and without
URL with very similar content
For this reason, it is essential to set up redirects and communicate very precisely to search engines via canonical tags and sitemap.xml which pages should actually be indexed.
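For the t-shirt scenario described above, the canonical tag could look like the following sketch. The shop URLs are invented placeholders; the tag is placed in the head of every color variant and points to the one URL that should be indexed (and that is also listed in the sitemap.xml).

```html
<!-- Sketch only: placed on https://www.example-shop.com/t-shirt?colour=red,
     ...?colour=blue, and the other variants. All URLs are invented. -->
<link rel="canonical" href="https://www.example-shop.com/t-shirt">
```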
With a redirect, users calling up a URL with http are automatically forwarded to the https variant. This happens so quickly that they often do not even notice it.

Pagespeed – How Fast Does Your Website Load?
For several years now, Google has recorded more accesses via mobile devices than via desktop devices. It is therefore important to offer a fast-loading website. This can be implemented in several ways:
Ensure small file sizes and short source code. This optimizes the website as a whole and ensures a good user experience.
Optimize visible content through prioritized delivery of the source code. With this measure, users see the first content in the visible area long before the entire website is loaded. This improves the perceived load time.
Switch the Hypertext Transfer Protocol (http) to version 2 (http/2). This is optimized for mobile devices and allows parallel loading of various files as well as preloading of content. We also recommend using https for good on-page SEO.
Figure 2 illustrates the Google Page Speed Test, which examines exactly these topics:

Figure 2: With the Google Page Speed Test, you can see which files are too large and which files delay optimized delivery of visible content.

Mobile Optimization is Becoming Increasingly Important
In addition to the load time, the display of the website on mobile devices must also be ensured. This is often achieved with responsive design, which ensures that the website adapts automatically to the screen size of mobile devices. If you are planning a relaunch soon, this issue must be addressed in your concept.

Structured Data for Good Onsite Marketing
With structured data, individual pieces of content are specifically tagged for search engines. This may include company information, product information, recipes, events, or, more recently, FAQs. There are many templates for using structured data on a website, although only relatively few are supported in Google optimization. The benefit is that Google can better understand the content and display it separately on the search results page.

Further SEO On-page Optimizations
The technical SEO area also includes further measures such as Progressive Web Apps, image tagging, or multilingualism. These are the "fine-tuning" aspects in the field of technical SEO optimization. Therefore, these detailed topics are not elaborated on here.

Content Optimization – The Second Step in On-page SEO
Once the website has been improved with technical on-page SEO measures, you can start optimizing your content. This is basically divided into two areas:

Meta Information: Optimizing Page Title and Meta Description
The optimization of the page title and meta description is referred to in on-page SEO as the optimization of meta information. The page title as an SEO criterion is particularly crucial, as it should contain the keyword and the brand. At the same time, it should not be too long, so that it remains readable. You always see the page title in the tab of your browser as well as on Google's search results page. The meta description itself is not SEO relevant but still has a significant indirect impact on onsite marketing. The meta description is displayed on search results pages of search engines if relevant.
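A hypothetical head snippet for this article's topic might look like the sketch below; the title and description texts are invented for illustration and would need to be adapted to your keyword and brand.

```html
<!-- Sketch only: keyword and brand in the page title, a call to action
     in the meta description. Texts are invented placeholders. -->
<head>
  <title>On-page SEO: Checklist for Better Rankings | internetwarriors</title>
  <meta name="description"
        content="Technical SEO and content optimization step by step – read our on-page SEO checklist now and improve your rankings!">
</head>
```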
By directly addressing users with a call to action, you can improve the click-through rate on your results and thus achieve a better ranking. Figure 3: Google search results with the page title in blue and the meta description in black. Content on the Page Itself – What Can the User Expect? The actual content must meet user expectations, otherwise, it will lead to high bounce rates. Remember that content should be created for users in the context of on-page optimization, not for search engines. For this reason, Google places great emphasis on how readable the content is. Furthermore, the texts must be well-structured and organized with headings. Increasing user interaction with the website is also desirable. This can be achieved through videos, images, image galleries, comments, or similar features. It is proven that longer dwell times are associated with an increase in conversions. If the content is well created, Google may use it as a Featured Snippet. This is the position 0 on the search results page, where Google directly answers the user's question on the search results page. Marketers have no influence over the use of a Featured Snippet and cannot predict when a snippet will be displayed and when not. Getting your own content into a Featured Snippet is therefore considered the pinnacle of Google optimization for content. Figure 4: Representation of a Featured Snippet on Google's search results page. Google Jobs – A Brand New Feature With Google Jobs, the search engine giant introduced a brand new feature in Germany in June 2019, which will lead to many on-page SEO optimizations in 2019. When users search for a job title, they are shown a list of available job offers. After clicking, they are directly redirected to the company's page, where they can apply for the job in the next step. However, to do this, website operators must use structured data and provide very specific content on the website. What We Can Do for You If you want to improve your positions in search engine rankings and thereby increase your number of conversions, we offer comprehensive support in the area of on-page optimization. Our on-page SEO measures are coordinated with other online marketing measures. Feel free to contact us, we look forward to your inquiry.
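As a technical footnote to the Google Jobs paragraph above: the structured data in question is the schema.org JobPosting markup. The following is a hypothetical sketch with invented values; the properties Google currently requires should be checked against its official documentation.

```html
<!-- Sketch only: schema.org JobPosting markup for a single job offer.
     All values are invented placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "JobPosting",
  "title": "Online Marketing Manager (m/f/d)",
  "description": "<p>We are looking for an online marketing manager for our Berlin office ...</p>",
  "datePosted": "2019-06-18",
  "validThrough": "2019-08-31",
  "employmentType": "FULL_TIME",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example GmbH",
    "sameAs": "https://www.example.com"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Berlin",
      "addressCountry": "DE"
    }
  }
}
</script>
```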
Opt-In, Initial Insights from Practice
Jun 28, 2018

Thorsten
Abrahamczik
Category:
Web Analytics

Opt-In – Impact on Online Marketing through the EU Cookie Directive
Under the new General Data Protection Regulation (GDPR), many marketers have experienced significant confusion regarding the EU Cookie Directive and the Opt-In and Opt-Out procedures. Additionally, there is uncertainty about the e-Privacy regulation, which is expected to become mandatory in 2019. In our article "No Google Analytics without Google Analytics Opt-Out Cookie", we have already discussed the necessity of a Google Analytics opt-out cookie on the privacy page. In this article, we want to explain the Opt-In procedure, which requires the explicit consent of the user for analysis and marketing measures. We will also illustrate how this procedure affects all online marketing channels and activities.

What is the Opt-In procedure?
The Opt-In procedure is based on the increasingly popular cookie notice, which mentions the use of cookies on websites. As shown in Figures 1 and 2, the formulations were revised on May 25, 2018, and in many cases supplemented with additional information on the use of cookies.

Fig. 1: Old cookie notice as it was used on https://www.internetwarriors.de before GDPR.
Fig. 2: Current notice, allowing users to exclude themselves from tracking.

Since this revision, many users have had the opportunity to agree to or decline the use of cookies on individual websites. Once users make a decision here, it must be respected for the use of cookies across the entire domain, apart from specific exceptions such as session cookies. However, many lawyers and data protection officers interpret the GDPR differently, resulting in users being offered various solutions. These range from simple cookie banners without selection options to Opt-In procedures. The impacts of the Opt-Out procedure are already known; with the Opt-In procedure, however, only very few companies have experience. For this reason, we tested the Opt-In procedure within the framework of GDPR cookies to gather initial insights that we can consider in future implementations.

Distinction between Opt-In and Double Opt-In
Before we look at the implementation and the impact on traffic, we need to differentiate between Opt-In and Double Opt-In:
Opt-In: An information banner is displayed to the user on accessing the website, informing them about the use of cookies and, if applicable, their purposes. The user must explicitly consent to the use of cookies before web analysis and marketing measures may be carried out. If they do not, neither web analysis nor marketing tools may be used.
Double Opt-In: This procedure is primarily used in email marketing. Upon subscribing to a newsletter, the user receives a confirmation email, requiring them to actively confirm their subscription.
As you can see, both procedures are independent of each other and have nothing in common.

Changes in traffic due to the implementation of Opt-In
As part of our Opt-In investigation, we examined the traffic development on 10 websites in Google Analytics before and after implementing the Opt-In. The Google Analytics screenshot in Figure 3 shows the number of sessions of a website on a daily basis, before and after the implementation of Opt-In.

Fig. 3: Traffic development from May 24, 2018, to June 21, 2018. Opt-In was implemented on June 8, 2018.

Comparing the period after implementation with the period before implementation and excluding the day of implementation, the following traffic changes arise:
Fig. 4: Comparison of developments in Google Analytics in the periods June 9, 2018 – June 15, 2018, and June 1, 2018 – June 7, 2018.

Other websites with about 5,000 sessions per day even show deviations of 83% – 85%. Only a few websites have a smaller deviation than shown in the screenshots here.

Configuring Opt-In with a Step-by-Step Guide
To help you understand how the entire procedure works, we would like to give you a detailed step-by-step explanation. Additionally, at the end of the article, we offer you the chance to download our configuration of the Google Tag Manager container so that you can import it into your own Google Tag Manager and gain experience with the implementation.

GDPR Cookie Notice on the Website
An essential requirement is a cookie banner on your website, informing users about the use of cookies and giving them the option to activate analysis and advertisement cookies or leave them deactivated. For simplicity's sake, we use the popular solution Cookie Consent by Insites for our test; we have also integrated this script on our website. Via the download menu, you can configure a banner that you then only need to copy into the source code of your website. During this process, you have to decide whether you want to use the Opt-In or Opt-Out procedure. In our current scenario, we use the cookie notice for Opt-In. The banner is then displayed immediately. Implementation is therefore very easy to carry out, even for less technically skilled individuals.

Storing the User Decision in a Cookie
Once you have embedded the banner on your website, it will be displayed to all users. Initially, however, nothing else happens; no cookies are blocked yet. Insites itself uses a cookie named "cookieconsent_status" to store the user's decision and not display the banner again on their next visit. This decision is valid for one year. The cookie values can be seen in Figure 5: "allow" for consent, "dismiss" for rejection. You can also see the expiration date of the cookie, after which the browser no longer takes it into account. We can read the "allow" and "dismiss" values with the Google Tag Manager and take them into account for triggering Google Analytics, Google AdWords, affiliate tags, etc. The Opt-In decision should not be limited to web analytics with Google Analytics, Etracker, Webtrekk, etc. alone; all remarketing and conversion tracking from other providers should also be considered.

Fig. 5: Status of the "cookieconsent_status" cookie for Opt-In

Consideration of Do Not Track
After taking the user's GDPR cookie decision into account, we want to consider a second way of rejecting analysis and marketing cookies: the Do Not Track procedure. In this case, the browser sends information to the server with each new page view that no user profile should be created and personal activities should not be tracked. Figure 6 shows the setting in Firefox's "Privacy & Security" section.

Fig. 6: Activation of the "Do Not Track" information in Firefox's privacy settings

Do Not Track is integrated into all relevant browsers like Google Chrome, Mozilla Firefox, Apple Safari, etc., but is disabled by default. Therefore, the user must make a conscious decision and manually enable Do Not Track. If they do, website operators offering the Opt-In procedure should respect this decision as well.
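Before looking at the Tag Manager configuration in the next section, here is a minimal sketch of how the consent cookie and the Do Not Track signal described above could be evaluated on the page. The cookie name comes from this article; the dataLayer event name "tracking_allowed" is an invented placeholder, and in practice the same checks are usually modelled with variables and trigger exceptions inside the Google Tag Manager itself.

```html
<script>
  // Sketch only: fire tracking only if consent was given AND Do Not Track is off.
  (function () {
    function getCookie(name) {
      var match = document.cookie.match(new RegExp('(^|; )' + name + '=([^;]*)'));
      return match ? decodeURIComponent(match[2]) : null;
    }

    var consent = getCookie('cookieconsent_status');            // "allow" or "dismiss"
    var dnt = navigator.doNotTrack || window.doNotTrack || navigator.msDoNotTrack;
    var dntEnabled = (dnt === '1' || dnt === 'yes');

    if (consent === 'allow' && !dntEnabled) {
      window.dataLayer = window.dataLayer || [];
      window.dataLayer.push({ event: 'tracking_allowed' });     // used as a GTM trigger
    }
    // Otherwise no analysis or marketing tags are fired.
  })();
</script>
```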
Interaction of individual configurations in the Google Tag Manager To configure Opt-In in the Google Tag Manager and consider GDPR relevant cookies, we have defined the following rules: Has the user explicitly agreed to the use of cookies for Opt-In? If yes, we check whether the user has activated Do Not Track If no, we keep all tracking disabled Has the user activated Do Not Track If yes, we keep all tracking disabled. This rule also overrides the previous rule if the user has agreed to tracking on the banner If no, we check whether the user has consented to the use of cookies. Only if both conditions are fulfilled will analysis and marketing cookies be activated The user has consented to the use of cookies The user has deactivated Do Not Track If even one value deviates, the cookies remain blocked. This way, the website operator offers maximum protection for users from cookie capture. At the beginning of the article, we showed that this setting in the Google Tag Manager resulted in significant traffic loss in Google Analytics. But since all remarketing and conversion tracking is also blocked, website operators can no longer tag their users and can measure success significantly less. Measuring Do Not Track Usage on Another Website Currently, according to our non-legally binding understanding, there is no obligation to use Opt-In tracking. However, this may change with the e-Privacy regulation in 2019. Regardless, it is not known to us that the Do Not Track feature is a mandatory measure for website operators. For this reason, we analyzed the use of Do Not Track on an eleventh site. This site serves family entertainment and is characterized by a high national as well as international traffic. It also serves both genders and age groups from infants to great-grandparents. We consider these numbers a good cross-section of society. In image 7, we have juxtaposed the number of sessions and accesses with activated Do Not Track. For measuring activated Do Not Track, we use "Unique Events" in Google Analytics, as this value is "session-based" and thus provides a comparable data basis. Fig. 7: At 10% of all sessions, Do Not Track is activated in the period June 13, 2018 – June 20, 2018 The collection period is June 13, 2018 – June 20, 2018. It is clearly visible that in 10% of all sessions, Do Not Track is activated. Here, users have made a very conscious decision not to be tracked. Learnings from the Test These are very valuable and important insights for us. The Opt-In procedure significantly reduces the metrics in the analysis and marketing tools and makes it considerably more difficult to capture users. If the use of Opt-In becomes mandatory, other methods would need to be developed to continue offering online marketing in the same quality. If you, as a website operator, need to decide between Opt-In or Opt-Out, you now know the pros and cons. We are also happy to offer our Google Tag Manager container configuration for download. Fill out the following form, and we will send you the download link by email. The .zip file can be easily opened, and you will find a .json file inside. When you are in your Google Tag Manager, click on "Admin" and then on "Import Container." Subsequently, select the .json file and import it into your Google Tag Manager container. You will then find all the templates we created. It will be exciting to see how the e-Privacy regulation impacts the EU Cookie Directive and how users take advantage of Opt-In options. What can we do for you? 
If you are unsure whether you need Opt-In tracking or if you experience difficulties implementing the Opt-In or Opt-Out procedures, we are here to help you. We support you in the implementation of your online marketing strategies and can quickly make technical adjustments to your website, should the new e-Privacy regulation require it.
Early configuration of Google Tag Manager with the Tag Manager Injector
Aug 31, 2017

Thorsten
Abrahamczik
Category:
Web Analytics

The Tag Manager Injector, as a Chrome browser plugin, allows for easy setup of the Google Tag Manager within Google Chrome. Without long waiting times, we can start servicing our clients and are not forced to wait. Meanwhile, the client's IT can take its time integrating the Tag Manager code into the website's source code. At the beginning of our collaboration, we work closely with our clients to create tailored tracking concepts. At this point, it is often not yet clear what exactly needs to be tracked and which metrics provide value to the client. Once the tracking concept is finalized and approved by the client, we begin implementing it. First, the Google Tag Manager code needs to be embedded into the site's source code. However, this can typically take several days, as the client's IT may not be able to implement it immediately. To start configuring in the meantime, we use the Chrome browser plugin Tag Manager Injector. This makes us much more independent from the client's IT and allows us to work faster. Requirements for Using the Tag Manager Injector To use the Tag Manager Injector, a Google Tag Manager account must first be created. Within this account, a container must be created. The Google Tag Manager then provides the code for the container, which the IT must integrate. At this point, we can already start using the Tag Manager Injector to configure the newly created container. In this context, the unique container ID, which identifies this particular container, is important. Fig. 1: The container created in Google Tag Manager with the container ID Using the Tag Manager Injector After this process is completed, you go into the newly created container and create the first tags, triggers, and variables. At this point, the container can already be configured as it is intended to be used later. There are no limitations here, as third-party tags and scripts can be used. After setting up the initial tags, Google Tag Manager's preview mode should be activated. This way, you can check whether the tags have been configured correctly. The next step is to install the plugin. Once the plugin is active, it can be used. To do this, click on the plugin itself and you will see a simple input screen where you enter the container ID of the previously created container. Next, under the point “Include Domain(s)”, enter the domain where the Google Tag Manager container should be used. Then, just click the “Start” button. Once the Tag Manager Injector is active, the area at the GTM container ID turns green. Fig. 2: The configured and usable Tag Manager Injector By activating preview mode in the container, the regular Google Tag Manager preview window opens on the website in the browser. It's now easy to see which tags fire, what variable values are displayed, and what the data layer looks like. If you want to push specific values to the data layer, the Tag Manager Injector offers a clearly visible input field “Push to the data Layer”. You just need to enter the respective information there, and the data will be transmitted. Fig. 3: The Google Tag Manager preview window in the browser In the real-time report of Google Analytics, initial accesses can now be seen, showing that Google Analytics tracking is working. How Can We Help You? Would you like to implement web analytics on your page according to a defined tracking concept but are unsure how to do this exactly? Do you want to measure and increase conversions more effectively but have problems implementing additional tracking? 
Contact us and we will gladly help you improve your web analytics.
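A brief footnote to the "Push to the data Layer" field mentioned above: what is entered there corresponds to a regular data layer push, as it would otherwise be written into the page itself. The event name and value below are invented placeholders.

```html
<script>
  // Sketch only: pushing a custom event into the data layer, which can then
  // be used as a trigger for tags in the Google Tag Manager.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'newsletter_signup',     // invented event name
    formLocation: 'footer'          // invented example value
  });
</script>
```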
DNS record entries in domain management and what you need to consider when making adjustments
Jun 22, 2017

Thorsten
Abrahamczik
Category:
SEO

As a website operator, you must also deal with the management of your domain. This involves not just purchasing a domain but also configuring it for web servers and email servers. This primarily concerns larger companies or agencies that manage domains on dedicated servers, operated separately from the actual web or email servers. Examples of such a scenario are providers like NICdirect or 1Blu, where domains are purchased and managed, with a specialization in hosting large enterprises. However, regular web hosts like Mittwald, Webgo, Host Europe, etc., also offer domain configuration to some extent.

Fig. 1: Record entries for a domain of internetwarriors GmbH

The Configuration of the Domain Name System
The Domain Name System, usually abbreviated as DNS, is one of the most important services on the internet. So-called DNS servers ensure that URL names are converted into IP addresses. When you type a URL into the browser, servers on the internet cannot initially do anything with it, because they identify each other via IP addresses. An IP address is a unique numerical combination assigned individually to each device on the internet. This is most easily compared to a phone number assigned to every phone line. To find out which server hosts the requested webpage (the URL typed into the browser), the browser first sends a request to a DNS server. This server maintains a large database that stores the IP address of the corresponding server for each domain, similar to a phone book where each name is paired with a phone number. In response to the request, the DNS server sends the IP address of the server hosting the webpage back to the browser. The browser can then place the request for the webpage directly with the actual web server.

Types of Record Entries
For the DNS server to know which IP address is behind a URL, this must be set in the domain configuration. There are various so-called record entries for different services. The most important ones are:
NS
A
AAAA
MX

Name Server Record - NS
The Name Server Record, usually referred to as NS, specifies which name servers are responsible for name resolution, i.e., for resolving the names of services such as domains into computer-readable addresses, the so-called IP addresses. Each of the entries has a so-called TTL (Time to Live). This determines how long an entry remains valid in the cache before it must be renewed. Typically, this value is 86400 seconds, which means 24 hours. Moving a domain to a new web server can therefore take up to 24 hours, as all DNS servers worldwide must first be updated before the correct IP address is delivered to browsers.

Address Record - A
The Address Record or A Resource Record ensures that an IPv4 address is assigned to an entry on a DNS server. IPv4 addresses are IP addresses based on a four-octet system. This greatly limits the number of possible addresses, which can no longer cover the IP addresses required for all devices connected to the internet, e.g., computers, smartphones, servers, etc. Nevertheless, it is still very common today.

Address Record - AAAA
The Address Record AAAA essentially provides the same functionality as the Address Record A. However, it is based on the so-called IPv6 addressing system, the successor of IPv4. It offers significantly more IP addresses and can therefore cover a much larger number of IP addresses and DNS entries.

Mail Exchange Record - MX
An MX Resource Record describes under which domain the corresponding email server can be reached.
Through this, email programs can send and receive their emails. It is important to note that this entry must always contain a fully written-out host name of the mail server, not just an IP address.

What Can We Do for You?
With our many years of experience in web hosting and domain management, we are happy to assist you in managing your website. Please contact us if you do not want to handle the technical management of your website yourself or if you are planning a relaunch. We would be very pleased to discuss your individual support needs with you.
A/B Testing with Google Optimize
May 11, 2017

Thorsten
Abrahamczik
Category:
Web Analytics

With Google Optimize , the search engine provider has introduced a new tool for conducting experiments on websites. Originally introduced as part of the Google 360 Suite, Google now offers the program, with a few restrictions, as a free version. This makes it easy for all marketers to use the tool for their own experiments. Fig. 1: The homepage of Google Optimize In addition to A/B tests, the program also supports multivariate and redirect tests. The following distinguishes the different types of tests: A/B Testing: This involves testing individual variations of the same webpage. Typically, the variations differ only in small parts, such as a different button color or a new call to action. Multivariate Tests: These tests work similarly to A/B tests, but in this case, several elements of a page are tested to find the best possible combination of elements. At the same time, it allows better investigation of user interaction between individual variations for conversion optimization. This quickly leads to a significantly larger number of variations. Redirect Tests: In these tests, separate pages with their own URLs are tested against each other. This way, different versions of entire pages can be effectively tested. Creating an Account and Container in Google Optimize To get started, users must open the homepage URL of Google Optimize . Upon initial opening, email subscriptions for (Tips & Recommendations, Product Announcements, and Market Research) should be considered, but they can also be declined. In the next step, the user must configure their account settings once. Fig. 2: One-time configuration of the Google Optimize account After making the changes visible in Figure 2, you can immediately begin setting up a website test. For the user, an account and a container are immediately created in which tests can be managed and configured. The setup is identical to the Google Tag Manager, which also relies on an account with individual containers. This significantly facilitates operation. Start with a First Test Before marketers begin creating a test, they must ensure that the website to be tested has many visitors. Only then can valid data be collected. If a website receives only a few visits per month, the evaluation of a test takes much longer to obtain statistically valid tests. In this case, only a few variations should be tested. In addition, marketers should conduct only small tests initially, such as changing a button color or swapping out text. This allows them to learn how to use the tool and understand how to build meaningful and effective tests. More complex tests can be created later. To start a test, the user must create a new experiment. Google offers templates for this, in which the user must select a name for the experiment, a URL for the page to be examined, and the type of test. Figure 3 shows the corresponding screen from Google Optimize. Fig. 3: These details can be used to create an experiment in Google Optimize. Work with Variations Once an experiment is created, the user can create so-called variations. These are slight alterations within the website. Regardless of the number of variations, each variation is always tested against the original version of the website. The marketer can also specify at this point how much traffic should participate in the test and how much traffic each variation should receive. By default, 100% of the traffic participates in an A/B test, and this traffic is evenly distributed across all variations. 
So, if there is the original version of the website and one variant version, each will receive 50% of the traffic. Fig. 4: For each test, a goal and a hypothesis must be set. After the variations are created, goals and descriptions must be set for the test. Examples of test goals include: Reducing bounce rates Increasing the number of page views Increasing the number of transactions Subsequently, the individual variations must be configured. A visual editor is used for this, allowing marketers to make directly visible changes to the website. For small changes, no knowledge of HTML, CSS, or JavaScript is necessary. For more complex changes involving HTML, CSS, or JavaScript, a general technical understanding of HTML and CSS is certainly required. For JavaScript changes, programmers should be consulted. Fig. 5: Google recommends installing the Google Chrome browser plugin for operating the visual editor of Google Optimize. To work with the visual editor, a browser plugin must first be installed. Google Chrome checks for the plugin and, as seen in Figure 5, suggests installation if necessary. Once the plugin is installed, the user can open the website and make adjustments. Figures 6 - 10 show how users can make adjustments: Fig. 6: By hovering the mouse, users select a webpage element. The individual elements are directly marked and highlighted by Google Optimize. Fig. 7: Changes to the selected element can then be made using the visual editor at the bottom right of the screen. Fig. 8: By clicking on "Edit Element" in the visual editor, further options can be selected, in this case, "Edit text". Fig. 9: Subsequently, the text of the H1 headline can be easily modified. Fig. 10: The bar at the top of the screen shows, among other things, which element the user is in (H1), how many changes have been made, and how each change appears on different device types such as desktop, tablet, and smartphone. Once the desired changes are made and saved, the appearance and behavior of the changes must be checked on each device. Quick errors may occur due to individual programming that can be avoided through extended checks. Linking Google Optimize with Google Analytics In the next step, Google Optimize must be linked with Google Analytics . For this purpose, the user selects a data view within the desired Google Analytics property. The user behavior data of this data view is then used to evaluate the test. This way, changes in bounce rates, the number of transactions, etc., can be considered in the experiment. Integration of Google Optimize via the Google Tag Manager In its developer area for Google Optimize, Google recommends using a modified Google Analytics code. This loads faster and prevents screen flickering caused by the dynamically made changes to the website. As a result, the user does not see upon page load that they are being shown a different variant. However, Google Optimize can also be integrated via the Google Tag Manager. In this case, the Google Tag Manager code should be placed as high up in the source code as possible. This is necessary to avoid possible screen flickering. The execution order of the different codes upon a page load is as follows: The user loads the page The Google Tag Manager code is executed The Google Optimize code is executed The Google Analytics code is executed Due to the use of the Google Tag Manager, there is a delay in execution, as Optimize can only be executed once the Tag Manager is loaded and executed. 
This is not the case when using a modified Google Analytics code, and the Google Optimize code can be executed immediately. As a result, on pages with many images or resources to load, the mentioned screen flickering can be reduced or completely avoided. In any case, Google Analytics should only be executed after the Google Optimize code, regardless of the use of the Google Tag Manager. A corresponding configuration must be set in the advanced settings of all Google Analytics tags on the website. Figure 11 shows the configuration of a Google Optimize tag in the Google Tag Manager. In this tag, essentially only the property ID of Google Analytics (also known as the UA number) and the container ID of the Google Optimize experiment must be entered. To comply with data protection, the IP address should also be anonymized. The trigger should be configured as precisely as possible to trigger the tag only when the corresponding page is called or the corresponding event is triggered. Fig. 11: Configuration of a Google Optimize tag in the Google Tag Manager. Defining the Target Audience and Timeframe Once the Google Optimize tag is published in the Google Tag Manager or the Google Optimize snippet is embedded in the page's source code, the further configuration of the experiment can proceed. For this, the target audience and duration of the test must be defined. In the free version, only the number of users participating in the test can be selected for the target audience. Currently, granular settings related to the target audience, such as age, gender, source of access, etc., are not possible. The timeframe can be individually set. Alternatively, the test can be started immediately and, if desired, ended immediately. Figure 12 shows the currently possible settings. Fig. 12: : Target Audience settings in Google Optimize. After an experiment, marketers can evaluate the results in the reporting area. Here, the data is displayed both in total and for each variation separately. In addition to standard data such as page views, the winner of the test is also displayed, including the improvements collected over the baseline. The individual variations are also checked against the set goals. This allows, in conjunction with further Google Analytics data, easy identification of which day and time each variation improved. Fig. 13: Evaluation of the results of an experiment. Source: https://support.google.com/360suite/optimize/answer/6218117?hl=en. What We Can Do for You Would you like to test individual elements of your website to increase conversions, leads, or transactions? Start with the free version of Google Optimize to quickly and easily conduct individual experiments. We are happy to advise you on the implementation and evaluation of corresponding tests for conversion optimization, in connection with your set website goals. Contact us.
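To summarize the load order discussed above in one place, here is a schematic sketch of a page head. The actual Google snippets are deliberately omitted and should always be copied from the respective product interfaces; only the order of the placeholders reflects the recommendation described in this article.

```html
<!-- Schematic sketch only: the real snippets are omitted on purpose. -->
<head>
  <meta charset="utf-8">
  <title>Example page</title>

  <!-- 1. (optional) page-hiding snippet to reduce screen flickering -->
  <!-- 2. Google Tag Manager container snippet, placed as high as possible -->
  <!-- 3. Google Optimize, here executed via a Google Tag Manager tag -->
  <!-- 4. Google Analytics, configured to fire only after Google Optimize -->
</head>
```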
What clients can expect from your agency in web analytics support
Feb 2, 2017

Thorsten
Abrahamczik
Category:
Web Analytics

Web analytics, what exactly is it? For many of our clients, web analytics means using a tool that receives data that is barely interpretable and possibly invalid. The operation quickly becomes overwhelming, so the tool is used only once every few months. This happens whenever management wants to see data about the website or when it's necessary to justify why the planned marketing budget for the next year is so high. But web analytics is much more than that! Web analytics means questioning interactions on your own website, evaluating the behavior of individual target groups, creating customer journeys, and much more. Ultimately, from all these evaluations and results, an action plan should be created that enables you to specifically optimize your website. In this article, we would like to explain what you, as an online marketer, can expect from an agency that supports you in the area of web analytics. We will discuss our experiences and explain how we approach the topic of web analytics with a client. Fig. 1: Cross-device web analytics | Source: http://bit.ly/2kxlWF4 Phase 1 – Reviewing the Status Quo in Tracking If you approach us as a potential new customer to talk about web analytics, we first describe what web analytics means to us: Web analytics is the foundation of all online marketing activities and provides the basis for all marketing-related activities, both online and offline. It collects both quantitative and qualitative data. When used properly, web analytics describes what happens due to customer interactions and, most importantly, why these interactions occur. Furthermore, it places your own data, if available in benchmarks, in relation to the data of other companies or industries. This shows where you can improve as a company. But more than anything, web analytics is one thing: continuous. Web analytics is very comprehensive and requires intensive support. However, it is not rocket science and can be easily implemented. To ensure this, we first check your current tracking implementation. This way, we obtain a status quo of your web analytics and can better assess where problems and potentials exist. Reviewing the Configuration of the Web Analytics Tool First, we go into the tracking tool and check which data is flowing in, how the tool is configured, and whether the data is valid. Most of our clients use Google Analytics as their tool, making the verification relatively straightforward and almost standardized. Nevertheless, we also look for peculiarities in the data and configuration. Examples include: Are filters being used correctly? Is there spam in the data that distorts evaluations? Is Google Analytics linked with other services like AdWords, Search Console, etc.? Are the data from internal searches being collected? Are demographic data activated? Have goals been set up in Google Analytics? As you can see, there is much to consider when configuring Google Analytics. Especially if you feel uncertain as a marketer and cannot precisely assess which setting causes which impact, key figures can easily be collected or interpreted incorrectly. A classic example is excluding your own accesses through an IP address filter. When we check the filter configurations in Google Analytics, it is incorrectly set up and does not function in 95% of the cases. Our staff member Bettina Wille has written an extensive article about what you need to watch out for when configuring filters in Google Analytics. We also check if you are using reports. 
These are generally represented in dashboards, radar events, or custom reports, so verification is easily feasible. Checking the Technical Implementation in the Source Code Once we know how your Google Analytics is configured, we check the implementation in the source code. This is not only about seeing whether the tracking code is implemented but also about how it is implemented. Below is a sample selection of aspects we check: Which version of the Google Analytics tracking have you implemented? Has the tracking code been fully implemented according to the configured Google Analytics settings? Are there pages where the tracking code is not implemented? Are multiple tracking code implementations present so that data is collected twice? Are specialties such as cross-domain tracking, e-commerce tracking, etc., implemented correctly? Are additional tracking features being used, e.g., custom dimensions, event tracking, etc.? Are referrers being correctly passed on and is direct traffic indeed just direct traffic? Reviewing Data Protection Finally, we check in individual areas whether your Google Analytics is installed in compliance with data protection regulations. Please note, however, that we are not a law firm and, therefore, cannot provide legally binding advice. Nonetheless, there are features that can be easily checked, such as: Are you anonymizing the user's IP address? Do you have a privacy page that mentions the use of Google Analytics? Is there an option for users to opt out of tracking, both via a browser plugin and a functioning opt-out cookie? Have you signed a data processing agreement with Google? Phase 2 – Developing a Tracking Concept In this article, we have so far focused exclusively on implementing with Google Analytics. Of course, there are also other types of web analytics with additional tools for things like A/B testing, surveys, etc., which also need to be reviewed. However, we do not wish to go into these tools individually in this article. After we have worked out a precise overview of your current tracking and activities in the area of web analytics, we move on to developing a tracking concept. In this, we describe your current status quo and define in great detail what tracking should be used in the future. We also determine jointly with you which data should be collected beyond standard tracking, which website goals should be examined, which KPIs should be defined, and how the configuration should be implemented. Figure 2 shows an excerpt from the table of contents of a tracking concept: Fig. 2: Together with you, we develop a tracking concept. Analysis and Research of Tracking Opportunities by the Agency At the beginning, we explore, based on the status quo analysis, possibilities for enhanced tracking on your website. It is important for us to only collect data that is valuable to you. We can also recommend tracking where we track everything and nothing, so to speak. In this case, we collect a lot of data, which leads to the point where you can no longer analyze it because the data volume is simply too large. Of course, that is not our goal. From our perspective, it requires a precise measure where you can work well with the key figures and receive all the information you need for your evaluations. Aligning Tracking Goals with the Client For the reasons mentioned above, we engage in very intensive exchanges with you during this phase. 
Here, there are many conversations with different people in your company to understand your requirements for web analytics better. Some of the topics we discuss with you include: Company goals Website objectives KPIs already in use Previous internal reports Hierarchy levels in the company including different reports for different contacts Cooperations with third-party providers Differences between reporting and web analytics Motivation for using Google Analytics Defining KPIs and Goals for the Web Analytics Through these discussions, we not only learn about your requirements but can also better assess which topics in web analytics are particularly interesting for you. Based on this, we can provide targeted recommendations for tracking implementation, KPIs, goals, etc. At this point, you do not yet know our final draft/proposal. Discussion of the Draft Concept with the Client and Approval from the Client Once we have worked out your tracking concept, there is a joint meeting with you in which we discuss the tracking concept with you in detail. This is especially important as the concept is partly very technical. However, it is essential that you completely understand the concept. If you have any requests for changes at this stage, we will discuss them and, if necessary, incorporate them into the concept. Phase 3 – Implementation of the New Tracking Once you have approved the tracking concept, we begin implementing the tracking. Here, we always start with the technical implementation before starting the configuration of Google Analytics. Using the Latest Technologies When we receive an order to implement tracking, we naturally always use the latest technologies. This includes using the Google Tag Manager as well as Universal Analytics. If specific reports are desired, we recommend the client use Google Data Studio. Google Tag Manager As we have already written in previous articles about the Google Tag Manager, the entire tag management can be easily handled, both for Google Analytics and for other tools like AdWords, third-party providers, etc. A major advantage of the Tag Manager is that in very few cases does the IT need to adapt the source code. The tasks associated with managing tags can then be carried out directly by the marketer in the tool. We have already described the functionality of the Google Tag Manager in a comprehensive article. Universal Analytics Universal Analytics is the current version of Google Analytics. The data collection of Universal Analytics differs slightly from the old asynchronous Google Analytics and offers additional benefits such as Enhanced E-Commerce, UserID, etc. Google announced in 2015 that they would cease to support asynchronous tracking in the future. When support will be discontinued is still unclear. Data Studio and Other Offerings With the Google Data Studio, another tool from the Google 360° Suite has been made available as a free version. Since the beginning of February, the tool allows you to create as many reports per email address as you like, the only limitation being that the Double-Click connector cannot be used in the free version. Otherwise, it has the same functionality as the paid version of the 360° Suite. Particularly interesting for marketers is that the reports can be provided with their own CI. This is especially useful when the reports are forwarded to management. Collaboration with IT or the IT Service Provider Once we start implementation, usually only a few adjustments need to be made to the source code. 
The old tracking remains intact for the time being. This ensures that the main tracking is not affected. For all changes to the source code, the IT or service provider receives precise instructions from us on what needs to be changed. Generally, the Google Tag Manager code needs to be embedded, but often also the opt-out cookie. If e-commerce is being used or conversion values should be dynamically transmitted from the website, adjustments are also required for this. The entire configuration of your new tracking is then implemented by us in the Google Tag Manager. This means we set up the Universal Analytics tag and create variables, triggers, and other tags to fulfill the specifications from the tracking concept. If we also support the client in other online marketing areas like Google AdWords, we implement this directly in the Google Tag Manager as well. This allows for easier management in the future. Throughout the entire phase, we work closely with you and your IT to ensure a correct implementation. If questions arise on the IT side, we offer advisory support. If we maintain your website/content management system, the adjustments are naturally carried out by us. Configuration of the Used Tools With a delay to the basic setup of Universal Analytics in the Google Tag Manager, we set up a so-called test property in Google Analytics that we use to test the data we collect. If we did not do this, we would have to direct the new data into your main property. This would lead to data distortion and problems with clarity that we want to avoid. Only when we know that all data is correctly transmitted from the Google Tag Manager to Google Analytics do we configure the Google Analytics of the main property. This is because some configurations are only possible when certain tags are configured in the Google Tag Manager. Once the data is correctly transmitted by the Google Tag Manager, and Google Analytics is correctly configured, we inform IT that they can remove the old tracking code. Once this has occurred, we adjust the tracking in the Google Tag Manager so that the data no longer flows into the test property but into your main property. This way, we ensure that your old metrics in Google Analytics are not lost and that you can compare your new data with the old data. It should be noted that your old data may not be valid. Setting Up Reporting Part of the configuration of Google Analytics and Google Data Studio may include setting up reports. These are created according to the specifications of the tracking concept. Client Training in the Use of the Tools Once the entire tracking has been implemented, you need to understand and be able to apply the entire implementation independently. For this reason, we conduct an introduction to the implementation, depending on your previous knowledge. This includes all tools (Google Tag Manager, Google Analytics, and Google Data Studio). This is purely an introduction to the implementation. If you as a client have very little experience in Google Analytics and, for example, no experience with Google Tag Manager or Google Data Studio, we recommend a thorough web analytics training. In this, we explain to you not only all the tools but also the basics of web analytics. Depending on the scope, this lasts between one and two days. Learn more about our Google Analytics training sessions. Phase 4 – Ongoing Support So that the new metrics from Google Analytics don't go unused after setup, we support our clients after implementation with ongoing support. 
This is generally done in two ways. On the one hand, we ensure that the data will continue to be collected correctly. On the other hand, we conduct more in-depth analyses. Regular Analyses and Creation of Evaluations by the Agency This allows you to focus on your reports while we conduct comprehensive web analytics for you. Here, we focus specifically on individual subareas. For example, we analyze the behavior of individual target groups, examine the behavior of users in specific areas of the website, or, alongside a custom channel attribution, develop a basis for an attribution model. In a second step, we further develop this together with you so that you can understand exactly how users come into contact with your company and how you might optimize individual channels to target users more precisely. We also support you in areas such as A/B and multivariate testing. As these tasks are time-consuming but also very productive, our clients gladly take advantage of our expertise and leave these tasks to us. Continuous Support in Case of Problems This topic primarily involves technical support. It often happens that errors in data collection occur due to website adjustments. This happens, for example, through renaming individual buttons, removing website elements, or a complete adjustment of a website element. Generally, IT doesn’t consider during implementation that changes might affect tracking. These errors usually become apparent to us quickly, and we can correct them in consultation with you. What Can We Do for You? Would you like to check your tracking, significantly expand it, or simply outsource the evaluation of the data to a third party? Do you want to conduct targeted optimizations with your web analytics and thus increase your leads? Contact us , and we will be happy to advise you on how we can optimally collaborate with you.
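To make the source-code checks from Phase 1 a bit more tangible, here is a minimal sketch in Python. The URL and property ID are hypothetical placeholders, the patterns assume the common analytics.js and gtag.js snippets, and the script covers only a few of the points listed in Phase 1 (missing or duplicated tracking code, the IP anonymization flag). A full audit goes considerably further.

```python
import re
import urllib.request

# Hypothetical values for illustration only.
PAGE_URL = "https://www.example.com/"
EXPECTED_ID = "UA-1234567-1"


def audit_page(url: str, expected_id: str) -> dict:
    """Run a few of the Phase 1 source-code checks against a single page."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")

    # All Universal Analytics property IDs (UA-...) found in the page source.
    found_ids = re.findall(r"UA-\d{4,9}-\d{1,4}", html)

    return {
        "tracking_code_present": bool(found_ids),
        "expected_id_present": expected_id in found_ids,
        # The same ID appearing more than once can indicate double data collection.
        "possible_duplicate_implementation": found_ids.count(expected_id) > 1,
        # Flags used by analytics.js ("anonymizeIp") and gtag.js ("anonymize_ip").
        "ip_anonymization_flag_found": "anonymizeIp" in html or "anonymize_ip" in html,
        "gtm_container_ids": re.findall(r"GTM-[A-Z0-9]+", html),
    }


if __name__ == "__main__":
    for check, result in audit_page(PAGE_URL, EXPECTED_ID).items():
        print(f"{check}: {result}")
```

In practice, the same checks would be run across a full crawl of the site rather than a single URL.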
Check Google Analytics Implementation with Screaming Frog
Sep 15, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Modern websites and content management systems have become very complex, making it challenging for users to make technical adjustments. From our day-to-day business, we know that online marketers often encounter issues, especially with tracking. But only through valid tracking can you generate clean data in the analytics tools. Only then are qualitative analyses possible. In this article, we present a method that allows you to easily check whether the Google Analytics code is installed on all pages. Furthermore, we will explain how to verify the code implementation. Screaming Frog – The Tool for SEO A well-known SEO tool is the Screaming Frog program. The main task of the tool is to crawl websites. For this purpose, a crawler (also referred to as a bot) is sent to the respective website to gather information on all subpages and their contents. Marketers can thus easily check to what extent the website has potential for SEO. In the free version, up to 500 website elements (HTML pages, images, CSS files, etc.) can be analyzed. The paid version is much more powerful and crawls all pages. Besides standard reporting, the tool offers the possibility to connect with external services such as Google Search Console and Google Analytics to obtain even more accurate analyses. Additionally, it also provides the option to conduct investigations on the website with custom filters. Essentially, this involves two different methods: Search: The affected pages are listed with URLs in Screaming Frog. Extraction: In this case, the desired content of the affected pages is displayed by Screaming Frog. Configuration of Screaming Frog Figure 1 shows Screaming Frog immediately after opening the program. By typing a URL in the bar and clicking on Start, the corresponding page or domain is immediately crawled, and initial information about the URLs flows into almost all tabs. The "Custom" tab will be important for you later. Fig. 1: The structure of Screaming Frog Identifying Pages Where the Google Analytics Code Is Missing If you want to identify pages or URLs where, for example, the Google Analytics code is not embedded, click on the "Search" button as shown in Figure 2. Fig. 2: Using custom filters in Screaming Frog In the subsequent dialog, Screaming Frog offers you ten different filters. To search for pages where Google Analytics is not installed, you only need one filter. Set the first field to "Does Not Contain" and enter the UA number of your Google Analytics property. You can also make other inputs there, but it's important that it’s something unique from the Google Analytics code. Also ensure that the entry does not match any other element in the source code. From our point of view, the Google Analytics UA number is a good choice. Fig. 3: Using the search filter in Screaming Frog If you want to check whether the Google Tag Manager is correctly embedded, you could enter "GTM-XXXXXX" as the container's ID. In this case, you are also using an element unique to the Google Tag Manager. Subsequently, under the "Custom" tab, all pages where the corresponding search term is not found will be listed. This way, you can easily identify which pages still need adjustments to achieve complete tracking. You can also go to IT with a specific action plan in this manner. Checking the Google Analytics Property ID on All Pages In the event that Google Analytics is installed on all web pages, you should additionally verify that the correct UA number is used. 
Mistakes can easily occur, and then you also don't have valid tracking. The "Extraction" method is perfect for this. Click on "Extraction" as shown in Figure 2. In the following dialog (Figure 4), enter "Google Analytics UA number" as an example in the first field. This field is used to name the corresponding column in the "Custom" tab. Fig. 4: Setting an extraction filter to read the value of the Google Analytics UA number In the second field, select "Regex". Regex is short for "regular expressions" and provides the opportunity to identify exact letter, number, and character combinations within a given text area, e.g., the source code of a webpage. To find Google Analytics elements on the page, use the following regex: ["'](UA-.*?)["'] This way, you can see which Google Analytics UA number is used for each page of your domain. You can easily spot any typos and correct them afterward. For the Google Tag Manager, you would use "["'](GTM-.*?)["']" as a regex filter in this case. In the next figure, you can see how the results are then displayed in the "Custom" tab. Fig. 5: "Custom" tab Conclusion Screaming Frog offers strong expandability with custom filters besides its diverse SEO analysis capabilities. Here, users can also check non-SEO content and gain insights that greatly simplify daily work. What Can We Do for You? Are you unsure if Google Analytics is correctly embedded on your site or wonder if Google Analytics tracking can be extended beyond standard tracking? Do you want to measure conversions even more accurately? Contact us, and we will be happy to advise you on checking existing tracking, creating a tracking concept, or implementing tracking. We look forward to your inquiry .
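If you want to reproduce both methods outside of Screaming Frog, for example as a quick spot check, the same logic can be sketched in a few lines of Python. The URLs and the UA number below are placeholders; the regular expressions are the ones shown above.

```python
import re
import urllib.request

# Hypothetical values for illustration only.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/contact/",
]
EXPECTED_UA = "UA-1234567-1"

# The same patterns used as custom filters in the article.
UA_PATTERN = re.compile("[\"'](UA-.*?)[\"']")
GTM_PATTERN = re.compile("[\"'](GTM-.*?)[\"']")

for url in URLS:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")

    # "Search" method: list pages that do NOT contain the expected UA number.
    if EXPECTED_UA not in html:
        print(f"{url}: expected UA number not found")

    # "Extraction" method: show which IDs are actually embedded on the page.
    print(f"{url}: UA numbers {UA_PATTERN.findall(html)}, "
          f"GTM containers {GTM_PATTERN.findall(html)}")
```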
Set filters in Google Analytics
Jul 28, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Google Analytics is the most popular web analysis tool. As a website operator, you can understand the behavior of your users on the site and subsequently define appropriate measures for page optimization. To ensure that you only find relevant data in your evaluation, it's wise to set certain filters. Why are filters useful in Google Analytics? Google Analytics captures all user data on the website without filtering. However, it is not always useful to include every recorded value in an analysis. With the help of filters, you can channel the data through a kind of sieve so that your data view only shows the values you need. Generally, filters fall into the following main categories:
Include filter: only the defined filter pattern gets through
Exclude filter: the defined filter pattern is excluded
Search & Replace: designations recorded by Google (e.g., homepage = "/") are rewritten
Lowercase: reduction of duplicate results by converting everything to lowercase
A common example of setting a filter is the exclusion of an IP address range. This allows you to exclude data from your company or agency network if it uses a static IP. The same applies to your IT service provider, who regularly visits the site from a static IP address without this being relevant for site optimization. This filter does not work with dynamic IP addresses, as these change regularly, often every 24 hours. Traffic from spam or bots can also be partly excluded with filters. Many referral URLs that hit sites as spam and skew your traffic are now known and published on dedicated portals. Check the referrals that lead users to your website: spam typically shows 1 page/session, 0:00 time on site, and an almost 100% bounce rate. When researching, you can get an overview of the referral URLs via the left navigation bar under Acquisition, All Traffic, Referrals. Fig. 1: Display referrals in Google Analytics You can also detect spam behavior in the Google Analytics reports under "Locations", "Website Content", and "All Traffic". A common example of spam access is the following: Fig. 2: Example of spam access How to exclude the IP address range for your corporate network If you want to examine user behavior with Google Analytics, accesses from your own network get in the way, because internal page views do not reflect the behavior of a genuinely browsing user. To prevent accesses from your own ranks from appearing in your data, set a filter as follows: Navigate to your Google Analytics account and open the admin view: Fig. 3: Admin view in Google Analytics In the right column of the data view, you will already see the "Filters" field, where you can add a new filter. Configure the filter: give it a meaningful name, select "predefined" as the filter type, and use an "Exclude" filter. Choose the "begins with" condition and enter the start of the IP address range to be excluded in the last field. Never use the "equals" condition: due to the data-protection-compliant anonymization of the IP address, the last octet is always truncated, so the complete sequence of numbers never reaches Analytics. Fig. 4: Filter to exclude IP addresses How to exclude referral URLs If checking your referral URLs has shown that certain accesses come from bots and therefore flow into your analysis as spam, you should create a filter to exclude this data. First, collect or research a list of these URLs and create a custom "Exclude" filter.
Give it a meaningful name and select the referral as the filter field. Then enter all the researched URLs into the filter pattern. Separate the individual URLs with a pipe (|) and add a backslash (\) before each dot so that the dot is treated as a literal character rather than as a regular-expression wildcard. Avoid spaces in the filter pattern entirely. Unfortunately, Google Analytics allows only 255 characters in the filter pattern, so you may need to set up multiple filters to exclude all referrals. Test filters before applying Once you've set the filters, they are immediately active. However, filtered data cannot be recovered, and the filtering cannot be reversed retroactively. We therefore recommend testing the filters in a test view before applying them to the live environment. Also use a third data view that collects all your website data without filters. This way, you can still reconstruct the correct values in case of errors. How to configure the data views is explained in our article Google Analytics Basic Configuration – What You Should Pay Attention To. What we can do for you The proper use of filters cleans up your data so that you can draw meaningful conclusions from your Google Analytics analyses. If you need support with Web Analysis, we look forward to your inquiry.
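As a practical addendum to the referral filter described above, here is a minimal Python sketch that assembles such a filter pattern: it escapes every dot with a backslash, joins the domains with pipes, and starts a new pattern once the 255-character limit would be exceeded. The spam domains listed are placeholders.

```python
# Hypothetical spam referrers for illustration only.
spam_referrers = [
    "spam-domain.xyz",
    "free-traffic.example",
    "bot-visits.example",
]

MAX_PATTERN_LENGTH = 255  # limit of a Google Analytics filter pattern


def build_filter_patterns(domains):
    """Escape every dot, join the domains with pipes, and split at the 255-character limit."""
    patterns, current = [], []
    for domain in domains:
        escaped = domain.replace(".", r"\.")  # literal dot instead of regex wildcard
        candidate = "|".join(current + [escaped])
        if current and len(candidate) > MAX_PATTERN_LENGTH:
            patterns.append("|".join(current))
            current = [escaped]
        else:
            current.append(escaped)
    if current:
        patterns.append("|".join(current))
    return patterns


for i, pattern in enumerate(build_filter_patterns(spam_referrers), start=1):
    print(f"Filter {i} ({len(pattern)} characters): {pattern}")
```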
The recording feature of Google Tag Assistant
Jun 30, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Many companies incorporate Google Analytics, a free web analysis tool, into their website to get an overview of their website's visitor numbers. However, this is often where the problems begin. In many cases, the tool or its code is simply copied into the page's source code. This does not take into consideration that the Google Analytics code must be placed in a specific spot, nor that the IP address needs to be anonymized. It is also seldom considered that in certain cases the referrer, meaning the reference/link through which a user came to your site, is not correctly passed on. In this case, the website visitor is recorded as a direct entry and not as a visit via a referral. Therefore, users are working with skewed numbers. To identify and avoid these issues, Google offers a practical browser extension for Chrome called Google Tag Assistant, as mentioned in the blog article The Best Tools for Successful Entry into Web Analysis . The Functions of Google Tag Assistant First, Google Tag Assistant provides the ability to display information about all Google tags. These can typically include the following tags, as shown in Screenshot 1, which also shows the button for the record function. Google Analytics, Remarketing, Conversions Tracking, DoubleClick, etc. Figure 1: Overview of Google Tag Assistant For each individual tag, you can display detailed information about the current status of the tag. A color-coding system further indicates whether the tag is correctly implemented (green), if there are slight deviations from the implementation recommendations (blue or yellow), or if there are significant problems causing errors in tag execution (red). The level of detail in the provided information can be configured in the Tag Assistant settings for each tag type. The Record Function As illustrated in screenshots one and two, a record function has been added to the tool with Google Analytics in mind, which allows for precise analyses across multiple page views. This enables you as a user to easily determine whether your data is being correctly captured and processed for Google Analytics. The advantage of this method: data is measured individually for all executed Google Analytics tags. In addition to page views, you can, for example, also examine event tracking or e-commerce tracking. This way, you can find errors that you would otherwise only uncover with significantly greater effort. Figure 2: Recording has begun Using this, we were able to discover an error in a customer's cross-domain tracking. It was correctly configured for page view tracking, but not for several event tags. This led to the referrer being correctly passed during page views of the second domain, but not for any event tags, where the fallback "Direct/None" was used. This verifiably distorted the metrics in Google Analytics. After a corresponding adjustment to the event tags, the appropriate referrers were correctly passed on, and the metrics were correctly integrated into Google Analytics once more. An advantageous feature is the ability to directly link Tag Assistant recordings with Google Analytics. In the analysis area of Google Tag Assistant, you can selectively choose individual data views of the Google Analytics account. You can also send specific location data to check whether certain data, for instance due to IP address filters, does not integrate into Google Analytics. 
In such cases, you can make the necessary adjustments in your Google Analytics configuration and immediately update the recording with a click of the refresh button. A new recording is not needed to see the results of the adjustments immediately. Screenshot 3 shows a section of the analysis area: Figure 3: Overview of the Google Analytics Record Function Analysis in Google Tag Assistant What We Can Do for You At internetwarriors, the Google Tag Assistant with its record function is one of the standard tools in the field of web analysis and SEA. We would be happy to review your Google implementation as well and identify any potential errors. If desired, we can also fix them. Improve the quality of your analyses with more accurate metrics and, correspondingly, your marketing budget allocation. Contact us.
Transparency in Google Ads: How to Properly Utilize Performance Max Channel Reporting
Oct 10, 2025

Josephine
Treuter
Category:
Search Engine Advertising

Google Ads is one of the most efficient ways to increase a company's reach and achieve targeted conversions. However, in times of AI and automation, the way campaigns are managed and evaluated is also changing. With the introduction of Performance Max campaigns, Google has created a new approach: all channels, from Search to YouTube to Shopping, are bundled into a single, fully automated campaign. This promises maximum efficiency, but at the same time makes it more difficult to trace through which channels the conversions are actually generated. For a long time, it was unclear which channel contributed what to the campaign's performance. Those who needed this information had to resort to technical scripts and complex workarounds - an effort that overwhelmed many teams. With the new Channel Performance Reporting, this changes fundamentally, allowing results to be evaluated per channel. In this article, we'll show you how to make the most of the new reporting, which best practices have already proven themselves, and how to make better decisions with more transparency. As an experienced Google Ads agency, we provide you with practical tips directly from everyday life at internetwarriors. The Essentials in Brief Performance Max bundles all Google channels into one campaign. The Channel Reporting now provides the necessary transparency. You can see how Search, Display, YouTube, Discover, Maps, and Gmail perform individually. The reports can be segmented by ad format, status, or targets like CPA or ROAS. The new reporting allows you to identify optimization potentials more quickly and control them more precisely. The status section helps with technical issues and offers recommendations for action. What Exactly Is a Performance Max Campaign? The Performance Max campaign , or PMax for short, is an automated campaign format in Google Ads available since 2021. It allows ads to be played simultaneously across multiple Google channels such as Search, Display, YouTube, Gmail, Discover, and Shopping, all in a single campaign. Unlike traditional campaigns, PMax relies on Google AI for ad delivery and optimization. Based on goals such as conversions or revenue, the system independently decides which ad to show to which user on which channel. For advertisers, this means less manual control and more focus on high-quality assets and strategic goal setting. With the new Channel Performance Reporting, it is now finally visible which channel contributes what to the overall performance, and this is an important step toward more transparency and control. Why Transparency in a PMax is so Important Performance Max campaigns offer many advantages: They bundle all Google channels into a single campaign, use AI for automated delivery, and promise maximum efficiency. However, this very automation brings a central challenge: a lack of transparency. It was long unclear through which channel a conversion actually occurred. This was a problem for anyone wanting to optimize their campaigns based on data. Without channel-specific insights, it is difficult to make informed decisions: Should more budget flow into YouTube or Search? Do video ads work better than text ads? Which audiences perform on which platforms? The answers to these questions are crucial for effective campaign management, and this is where the new Channel Performance Reporting comes in. It provides the necessary transparency to evaluate the performance of individual channels, identify optimization potentials, and strategically manage budgets. 
For agencies like internetwarriors, this is an important step to not only deliver results to clients but also develop transparent strategies. How to Find Channel Reporting in Your Google Ads Account The new Channel Performance Reporting for Performance Max is currently still in beta. This means that the feature is being rolled out gradually and may not be immediately available in every Google Ads account. The scope of the displayed data can also vary depending on the account, ranging from basic channel metrics to detailed conversion insights. If your account is already enabled, you can find the reporting directly in the Google Ads interface under: Campaign Overview → Select Performance Max Campaign → Insights → Channel Performance There, you will receive a detailed breakdown of important metrics such as impressions, clicks, conversions, costs, and ROAS. The view can be filtered by time period, device, or conversion goal, providing a valuable basis for data-driven optimizations. What Exactly Does the Channel Reporting Show You? The Channel Performance Reporting provides a structured overview of the performance of individual channels within a Performance Max campaign. It shows how the campaign is distributed across platforms like Search, Display, YouTube, Gmail, Discover, and Shopping, and what each channel's share of the achieved conversions is. This transparency allows an informed evaluation of budget distribution, identifies underperforming channels, and assists in prioritizing future investments. Additionally, the reporting offers extensive segmentation and filtering options. The data can be analyzed by key metrics such as Cost per Acquisition (CPA), Return on Ad Spend (ROAS), or Click-Through Rate (CTR). This provides a comprehensive view of the campaign's performance, both cross-channel and data-driven in a strategically usable way. What Can Be Learned from the Data The Channel Performance Reporting delivers far more than just numbers. It opens up new perspectives for the strategic management of Performance Max campaigns. By breaking down key figures like impressions, clicks, conversions, and costs by channel, it becomes visible which platforms are genuinely contributing to achieving targets and how the deployed budget is distributed. This data enables an informed assessment of the used ad formats, targeting methods, and device distribution. Conclusions can also be drawn regarding the customer journey and potential optimization potentials can be identified, for example, in the design of assets or budget allocation. For agencies like internetwarriors, this transparency is a valuable foundation for not only optimizing campaigns efficiently but also communicating transparently with clients. How to Optimize Your Campaigns with the New Insights The channel-specific data from the Channel Performance Reporting provides a valuable foundation for the strategic optimization of Performance Max campaigns. By analyzing individual channels, it becomes apparent which platforms work particularly efficiently, where wastage is occurring, and which ad formats achieve the best results. Based on this, budgets can be distributed more strategically, assets can be designed more precisely, and target groups can be addressed more diversely. Furthermore, the insights enable a more precise evaluation of the customer journey: Are users addressed via YouTube but convert only via Search? Such patterns can now be comprehended and incorporated into the campaign structure. 
The selection of conversion goals can also be newly assessed based on the data to further align campaign orientation with actual user behavior. Limitations and Pitfalls of Channel Reporting Even though the Channel Performance Reporting represents an important step towards transparency, current limitations and pitfalls should not be neglected. Since the feature is still in the beta phase, availability is not guaranteed across the board, and the scope of displayed data can vary from account to account. In some cases, only aggregated values are displayed, without deeper insights into individual ad formats or audiences. Moreover, it should be noted that Performance Max operates cross-channel, and the individual channels do not stand alone but work collectively. A channel with seemingly weak performance can nevertheless make an important contribution to conversion, for example, through early user engagement in the funnel. Therefore, interpreting the data requires a holistic understanding of the customer journey and shouldn't rely solely on individual metrics. Technical limitations such as incomplete conversion attribution, missing asset data, or limited segmentation options can also complicate analysis. Therefore, a combination of Channel Reporting, conversion tracking, and supplementary tools such as Google Analytics or server-side tracking is recommended for a sound evaluation. Conclusion: More Control, Better Decisions With the new Channel Performance Reporting, a decisive step toward transparency within Performance Max campaigns is taken. The ability to evaluate channel-specific data directly in the Google Ads interface provides a solid basis for strategic decisions and targeted optimizations. Even though the feature is still in the beta phase and not fully available in every account, it is already clear how valuable these insights are for modern campaign management. The combination of automation and data-driven control makes it possible to distribute budgets more efficiently, use assets more targetedly, and better understand the customer journey. For agencies like internetwarriors, this means: more clarity in analysis, better arguments in customer communication, and significantly increased effectiveness in digital marketing. As an experienced Google Ads agency, we help you harness the full potential of your Performance Max campaigns. We assist you not only with setting up and optimizing your campaigns, but also with the targeted use of the new Channel Performance Reporting. This way, you'll gain clear insights into the performance of individual channels, can distribute budgets sensibly, and make data-based decisions. With our expertise in AI-supported campaign management and cross-channel analysis, we ensure that your ads not only perform but are transparent and traceable. Get in touch with us!
VKU Marketing Experts 2025 – AI in Focus
Oct 8, 2025

Axel
Zawierucha
Category:
Inside Internet Warriors

On September 24, 2025, Berlin was the hotspot for marketing experts from public utilities. The VKU Marketing Experts Congress provided an excellent platform to discuss the industry's most pressing issues. This year's top topic: the unstoppable rise of artificial intelligence in marketing. As internetwarriors, we were there, represented by our experts Julien Moritz (SEO/GEO expert) and Axel Zawierucha (CEO), to share our knowledge and gain new insights. The transformation is now: AI as a game-changer The atmosphere at the congress was marked by a palpable sense of optimism. Numerous lectures and discussions made it clear that AI is no longer just a buzzword but a tangible tool that is revolutionizing marketing strategies. From personalized customer engagement to automated content creation and data-driven forecasts – the possibilities seem endless. However, with new opportunities come new challenges. One of the central questions that arose in many conversations was: How can companies remain visible in a digital landscape dominated by AI systems and language models (LLMs) and effectively reach their target audiences? Our workshop: Visibility in the age of AI We dedicated our interactive workshop to precisely this question. Under the title "Visibility in the AI Era: How to Position Your Business in New Systems," Julien Moritz and Axel Zawierucha provided practical insights and strategic advice. The interest was overwhelming. Intense discussions with participants made it clear that many companies are seeking guidance on how to prepare their content and data to be optimally captured and presented by AI-based search and recommendation systems. We demonstrated how a well-thought-out data strategy and content optimization for semantic searches can make a significant difference. The many exciting questions and the enthusiastic participation showed us that we struck a chord here. How we as GEO specialists can support Especially in a local context, geographic visibility is crucial. As GEO specialists, we help you strengthen your presence in local search systems and map applications – an important factor to be found even in AI-driven environments. With structured location data, local SEO, and targeted integration into semantic search systems, we ensure that your offerings appear where your target audience is searching – today and in the AI-driven future. Contact us!
DMEXCO 2025: CRM, AI and the Future of Search Engine Marketing
Sep 24, 2025

Axel
Zawierucha
Category:
Inside Internet Warriors

DMEXCO 2025 in Cologne was more than just a trade show for us at internetwarriors.de – it was a vibrant marketplace of ideas, a melting pot of innovations, and above all, a confirmation of the topics that move us and our customers every day. With a record attendance of over 40,000 participants and under the motto "Be Bold. Move Forward.", this year's leading trade fair for digital marketing sent a clear signal: The future belongs to the bold, the pioneers, and those who are ready to blaze new trails. In countless inspiring conversations with customers, partners, and industry colleagues, a common thread emerged for us, connecting the central challenges and opportunities of our time: the inseparable linkage of Customer Relationship Management (CRM), the revolution through Artificial Intelligence (AI), and the redefinition of campaign planning in the era of generative models. The foundation of successful performance campaigns: The CRM feedback loop A theme that repeatedly came to the forefront in our conversations at DMEXCO 2025 was the immense importance of deep integration of CRM systems in performance marketing campaigns. It's a realization as simple as it is crucial: those who want to successfully generate leads must not merely scratch the surface. The mere generation of contact information is only half the battle. The true value unfolds only when a seamless feedback loop between marketing and sales is established. This is where CRM comes into play. It is the centerpiece that consolidates all relevant information about a potential customer and tells us what actually became of a generated lead. Was the contact qualified? Did it lead to a sales conversation? Was a contract concluded? This feedback is pure gold for optimizing performance campaigns. Without this feedback, we operate blindly. We see which ads and keywords generate clicks and conversions, but we don't know which truly lead to revenue. From our extensive practical experience and the intensive discussions at the fair, we can make a clear recommendation. As an official implementation partner of Teamleader in Germany, we have gained deep insights into the capabilities of modern CRM systems. We are convinced that Teamleader unites all critical features to conduct business successfully. From the central contact database, deal tracking, and project management to time tracking and invoicing, the platform offers an all-in-one solution specifically tailored to the needs of agencies and service-oriented SMEs. Seamless integration enables precisely the valuable feedback loop essential for data-driven marketing. The discussions at DMEXCO showed that companies that have successfully closed this loop deploy their marketing budgets much more efficiently. They can target their campaigns on the channels and audiences that deliver the most valuable customers. In an era where digital advertising costs are steadily rising, and competition is becoming more intense, this data-driven precision is no longer a "nice-to-have," but an absolute must for sustainable success. Google in transition: The future of search in the age of AI Of course, the future of AI and Google, along with organic search, was one of the dominant topics in the Cologne exhibition halls. The era of purely keyword-based searches is coming to an end. Generative AI models and the increasing integration of AI into search engine result pages (SERPs) signal a paradigm shift. The question everyone is asking is: How will search change, and what does it mean for our SEO and SEA strategies? 
The keynotes and expert lectures at DMEXCO painted a clear picture: search will become more contextual, dialogue-oriented, and personalized. Users no longer just expect a list of links, but direct answers and solutions to their concerns. Google's "Search Generative Experience" (SGE) is just the beginning. The ability to understand complex queries and answer in full sentences will fundamentally change how we search for information. For us as an agency, this means we need to adapt our content strategies. It's no longer just about optimizing individual keywords but about creating comprehensive thematic worlds that holistically answer users' questions. "Topical Authority" will become the new currency in SEO. We must become the experts in our niche and create content that offers real value both for users and for AI-driven algorithms. At the same time, AI also opens new possibilities for paid search. Performance Max campaigns are a good example of how Google uses AI to automate and optimize the display of ads across the entire Google network. The challenge for us marketers is to provide the AI with the right signals.
Marketing in the Age of AI: Welcome to the New Reality
Sep 5, 2025

Axel
Zawierucha
Category:
Artificial Intelligence

A specter is haunting the marketing world – the specter of artificial intelligence. But instead of spreading fear, it brings a wave of transformation that redefines the very foundations of our industry. Gone are the days when marketing relied solely on intuition, manual segmentation, and broad campaigns. Today, in 2025, we are in the midst of a revolution driven by algorithms, machine learning, and Large Language Models (LLMs). For us at internetwarriors, it's clear: AI is not a passing trend but the new operating system for successful marketing. But what does that mean exactly? What has really changed? How do you need to adjust your strategies to not just survive but thrive? And how is perhaps the most important component of all – user behavior – changing? This article is your comprehensive guide to marketing in the age of AI. We dive deep into the changes, show you practical strategies, illuminate new user behaviors with current research insights, and look beyond to see which future trends from the USA and Asia will soon be our reality. The new playing field: What AI has fundamentally changed in marketing Artificial intelligence is more than just another tool in your toolkit. It's the invisible hand that optimizes processes, provides insights, and enables interactions at a speed and precision that were pure science fiction just a few years ago. The core changes can be observed in four key areas: 1. Real-time hyper-personalization: Previously, personalization was addressing a customer by their name in an email. Today, personalization means presenting the user with exactly the content, product, or message that matches their current need – across all channels. AI algorithms analyze massive amounts of data from user behavior, purchase history, demographic information, and even contextual data (such as weather or location) in milliseconds. The result: Dynamic web content, personalized product recommendations in online stores, and individually tailored ads perceived not as intrusions but as relevant services. 2. Predictive analytics and data-driven forecasts: Marketing has long been reactive. We analyzed past campaigns to optimize future ones. Marketing AI reverses this principle. Predictive analytics models can predict with high probability which customers are most likely to churn (Customer Churn), which leads have the highest purchase probability (Predictive Lead Scoring), or which products will sell best next season. This foresight allows you to act proactively, distribute budgets more efficiently, and focus your resources on the most promising segments. 3. Automation of content creation and distribution: Generative AI has revolutionized content creation. Tools like ChatGPT, Jasper, and more advanced, industry-specific models can now create high-quality texts for blogs, social media posts, emails, or product descriptions. But it goes far beyond that: AI systems can also generate images, videos, and even music. For you as a marketer, this means a significant increase in efficiency. Routine tasks that used to take hours are now done in minutes. At the same time, AI enables the creation of content for A/B tests in countless variants and the automatic distribution through the right channels at the right time. 4. Efficiency through intelligent automation: Besides content creation, AI automates countless other marketing processes. 
From programmatic buying of ad space (Programmatic Advertising) to intelligent bidding strategies in Google Ads to automatic segmentation of target audiences – AI handles repetitive, data-intensive tasks. This not only results in massive time and cost savings but also minimizes human errors and continuously optimizes campaign performance based on data. The marketing strategy 2025: How to successfully navigate the AI era A new technological reality requires a new strategic approach. It's not enough to just introduce a few AI marketing tools. Your entire marketing strategy needs to be rethought in the context of AI. 1. From target groups to "Segment-of-One": Your radical personalization strategy Your central strategy should be hyper-personalization. The goal is no longer to reach a target group but to treat each individual customer as their own segment ("Segment-of-One"). Practical implementation: Invest in a robust Customer Data Platform (CDP) that centralizes all customer data in one place. Use AI-driven personalization engines for your website, online store, and email marketing. These systems dynamically adjust content based on each user's click behavior, dwelling time, and purchase history. 2. Conversational marketing: Dialogue as the new funnel Users no longer want to fill out forms or be stuck on hold. They expect immediate answers and a direct dialog. AI-powered chatbots and voice assistants provide the solution. Practical implementation: Implement an intelligent chatbot on your website that not only answers standard questions but also qualifies leads, books appointments, and guides users through the purchasing process. Train the bot with your corporate data to ensure accurate and brand-consistent responses. 3. Content strategy: Quality and AI optimization hand in hand In the age of AI content creation, the sheer volume of content will explode. To stand out, two things are crucial: first, exceptional, human-centered quality, and second, optimization for AI systems. Practical implementation: Use generative AI as a tool for ideation, draft creation, and text optimization for SEO. However, the final editing, strategic direction, and emotional depth must come from human experts. At the same time, structure your content (e.g., through Schema.org markup) for easy understanding and prominent placement in answers from AI search engines like Google’s Search Generative Experience (SGE). 4. SEO and AI: The symbiosis for your visibility SEO and AI are inextricably linked. Google's algorithms, particularly RankBrain and BERT, are deeply rooted in machine learning. The future of search lies in answering complex queries, not just matching keywords. Practical implementation: Focus on thematic authority (Topic Clusters) rather than individual keywords. Create comprehensive content that fully answers user questions. Use AI tools to analyze SERPs, identify content gaps, and optimize your content for semantic search. Global outlook: These AI trends from the USA & Asia are defining the future While we in Europe are beginning to fully harness the potential of AI, the USA and Asia serve as "future labs." Different regulations, a higher risk appetite, and a deep-rooted "mobile-first" culture accelerate the adoption of technologies that will soon dominate our market. Trend 1 from Asia: The "Super-App" ecosystem & Social Commerce 2.0 In Asia, especially China, with apps like WeChat or Alibaba , "Super-Apps" dominate daily life. 
In these closed ecosystems, digital life unfolds: chatting, shopping, paying, booking services. AI is the glue that enables a seamless, hyper-personalized customer journey within a single platform . Live-stream shopping on steroids: Forget QVC. In Asia, live streams are interactive events. AI tools analyze viewer comments in real-time to suggest products to the influencer. Group-buying through AI: AI identifies potential buyers with similar interests and connects them into groups to obtain better prices through joint purchases. Trend towards conversational commerce: WhatsApp and Instagram are increasingly moving towards this direction. The trend relentlessly moves toward Conversational Commerce . Your first crucial step into this future is to deploy an intelligent chatbot on your website that not only answers standard questions but also qualifies leads, books appointments, and guides users through the purchase process. Trend 2 from the USA: The autonomous marketing manager In the USA, the trend is moving from AI assistance to AI autonomy . So-called "agent-based systems" are AI-generated virtual managers that not only assist, but autonomously manage campaigns and make strategic decisions. The Autonomous Marketing Manager: Instead of saying, "Create 10 social media posts," the AI will decide whether to write blog articles, launch Google Ads, or start an email campaign on its own. It analyzes the market, target audience, and performance to make these decisions. What does this mean for you? This trend is technologically demanding but will radically change your role as a marketer. Your task will be to centralize your data infrastructure (e.g., with a Customer Data Platform). Only with clean, accessible data can you effectively leverage AI systems to their full potential. Trend 3 from the USA & Asia: AI influencers and the era of synthetic media Virtual, AI-generated influencers with millions of followers and contracts with global luxury brands are emerging. They are precursors to a revolution in content creation. The forebears of change: AI-generated influencers like these herald new possibilities for brands to easily and inexpensively create their synthetic personas, designed to be visually and characteristically tailored to match a brand. What does this mean for you? In a world where AI anticipation of needs reduces tolerance for irrelevance, marketers should focus on transparency and creativity. Instead of replacing actual people, AI avatars can be used as fantasy figures, futuristic ambassadors, or scalable, personalized video tutorials. Transforming the marketer: The new era of AI and marketing AI is changing the expectations and behavior of consumers. They now anticipate immediate responses and direct dialogues in customer interactions. In response to this evolving technological landscape, marketers must evolve their strategies. It is not enough to merely introduce a few AI marketing tools; the entire marketing strategy must be reimagined using AI as a partner rather than a threat. Develop data competence: You don't have to become a data scientist, but you must learn to interpret data to draw valuable insights. Focus on strategic planning: Instead of manually setting up A/B tests, your task will be to set strategic directions while the AI optimizes the methodologies to achieve those strategies. 
Master creativity and storytelling: In a world where sheer volume of AI-generated content is set to explode, what will make a difference is outstanding, human-centered quality and content optimized for AI systems. Prompt engineering as a new skill: The quality of a generative AI's output now heavily relies on the quality and creativity of the input and command structure you provide. Lifelong learning: In the AI-driven world, continual education becomes a necessity rather than an option, as the technologies and methodologies keep evolving. Conclusion: The future of marketing is a symbiosis with AI The ubiquitous presence of AI is inevitably shaping expectations and consumer behavior. AI is freeing us from repetitive, manual tasks, giving us the capacity to focus on areas where humans are irreplaceable. Success will belong to those who understand this new technology not as a threat but as an ally. The winners will be those who combine the analytical power, speed, and scalability of AI with the irreplaceable human qualities of creativity, strategy, and storytelling. At internetwarriors, we look forward to this future full of opportunities. We see AI as a partner that frees us from repetitive, manual tasks, allowing us to dedicate our capabilities to where humans make the difference.
AI Max for Search Campaigns - How AI is Changing Google Ads
Sep 3, 2025

Markus
Brook
Category:
Search Engine Advertising

Online marketing is constantly evolving, driven by technological innovations. A current example is the introduction of Google's AI Max campaigns. This campaign type is specifically designed for search campaigns and utilizes artificial intelligence to control ads more efficiently. Below, we explain what AI Max for search campaigns is, the benefits it offers, and the requirements it places on advertisers. Key Points AI Max is a new campaign feature in Google Ads that uses machine learning for automated ad placements and bidding. AI Max combines existing Google Ads features such as Broad Match, DSA, and automatically generated assets. The focus is on maximizing conversions and conversion values. AI Max combines traditional search campaigns with AI-driven bidding strategies. Automation reduces management effort but requires clear goals, data, and high-quality assets. Control is achieved through goal definitions and continuous monitoring of campaign performance. Introduction: What is AI Max? Google continually develops its advertising platform, increasingly relying on artificial intelligence. With AI Max for search campaigns , a new campaign feature is introduced specifically designed for Google search. AI Max uses machine learning to automatically control ads, adjust bids in real-time, and increase the likelihood of conversions. The goal is to reduce manual effort and enhance the efficiency of search campaigns. How AI Max Works Unlike traditional search campaigns, Google AI Max heavily relies on automation. Assets, including ad titles, descriptions, sitelinks, or extensions, are provided to the system. The AI combines these components independently and dynamically creates ads that optimally match the respective search query. Additionally, the system continuously analyzes user signals like location, search history, or interaction patterns. This data is used to identify relevant target audiences and optimize ads in real-time. This makes campaign management significantly more precise and faster than manually possible. 1. Keywordless Technology: Search Ads Without Classic Keywords A central element is the so-called “keywordless matching.” Instead of relying on exact or phrase match keywords, Google analyzes landing pages, existing assets, and user behavior with AI to serve appropriate search queries. This is reminiscent of Dynamic Search Ads functionality, but in an even more automated framework. 2. Text Automation with AI The automatically created assets are another building block in AI Max. Google dynamically creates ad texts based on the website, previous ads, and other available data. 3. Final URL Expansion With final URL expansion, Google may direct users to a different target page than originally set if the AI assumes a better conversion probability exists there. This feature is also based on known DSA campaign mechanics. Benefits of AI Max in Google Ads The introduction of AI Max offers several benefits for advertisers: Time Savings Through Automation : Manual adjustments of bids and ad texts are mostly eliminated. Higher Conversion Probability : Google itself states that AI Max can generate up to 14% more conversions on average. Extended Reach : Ads are no longer only triggered by classic keywords but can also cover additional relevant search queries. Transparency : New reporting features show how AI makes decisions and what adjustments were made automatically. Despite the advantages, AI Max also carries risks. Automation can lead to unexpected and sometimes uncontrollable results. 
For example, AI may serve ads for search terms that do not directly align with the brand core or product, leading to irrelevant traffic and reduced efficiency. Another risk is that performance heavily depends on the quality of the provided assets and the data foundation. If these are faulty or insufficient, the AI may draw incorrect conclusions and steer the campaign in the wrong direction. In the worst case, this could result in wasted marketing budgets without achieving the desired results. Especially for clients with limited budgets and insufficient conversions, we currently do not recommend using AI Max. Challenges and Limitations Reduced Manual Control : Many decisions are taken over by the AI, meaning fewer intervention possibilities. Dependence on Data Quality : The AI can only work effectively if high-quality assets and precise conversion goals are provided. Continuous Monitoring Required : Even automated campaigns must be regularly reviewed and adjusted to be successful over the long term. First Practical Insights: What Companies Achieve with AI Max AI Max is not just a theoretical concept but already delivers real results, as proven by two early case studies from the beta phase that Google itself presents. Both L’Oréal Chile and the Australian provider MyConnect used AI Max and were able to make their search campaigns significantly more efficient. L’Oréal Chile: Higher Conversion Rates at Lower Costs The cosmetics giant used AI Max specifically to identify new keyword potentials and increase the relevance of its ads. With success: the conversion rate doubled while the cost-per-conversion decreased by a whopping 31%. An example shows the potential: The campaigns suddenly targeted search queries like “what is the best cream for facial dark spots” – terms that would likely never have been covered with classic keyword strategies. AI Max thus helped to specifically address relevant long-tail intentions without manual setup. MyConnect: More Leads Through New Search Impulses The Australian company MyConnect was already using Broad Match and tROAS. Nonetheless, activating AI Max brought clear improvements: 16% more leads 13% lower costs per conversion 30% more conversions from novel search terms Particularly intriguing: the strong increase in so-called “net-new queries” – search queries previously not covered by the existing keywords or assets. Here lies the real added value of AI Max: it recognizes opportunities that were not visible before. Best Practices for Using AI Max For AI Max to be successfully used, companies should follow some principles: Provide High-Quality Assets – Diverse ad titles and descriptions make it easier for AI to optimize. Define Conversion Goals Clearly – The more precise the goals, the better the AI can control the campaign. Conduct Regular Analysis – Despite automation, controlling metrics like ROAS, CTR, and conversion rate remains important. Review Brand Keywords – It may be wise to exclude brand terms to reach new target groups instead of just serving existing search queries. Conclusion: Opportunities and Limits of AI Max AI Max for Search Campaigns is a step towards greater automation in Google Ads. Companies can benefit from this technology if they strategically prepare their campaigns, set clear goals, and regularly monitor the results. The AI does not replace a well-founded marketing strategy but complements it. When used correctly, AI Max can help use budgets more efficiently, reduce administrative effort, and enhance performance. 
If you want to discover how AI Max or other innovative approaches to Google Ads with AI can help your company, we are here for you as experts in SEO , GEO, and SEA . Contact us today for a free consultation to revolutionize your online marketing strategy. FAQ: Frequently Asked Questions about AI Max What is the difference between Performance Max and AI Max? Performance Max covers all Google channels, while AI Max is specifically developed for search ads. Is AI Max suitable for every company? AI Max is best suited for companies with clear conversion goals that have enough budget to provide the AI with enough data for learning. For smaller budgets or very specific niche markets, a classic Google Ads campaign or a targeted SEO strategy may be more sensible. How do I maintain control when so much is automated? Control is exercised through assets, conversion goals, and regular analysis reports. These provide transparency and show how the AI is optimizing. Can I exclude keywords? Yes, excluding keywords is an important best practice. It helps ensure the campaign does not only target users already searching for your brand but also reaches new potential customers.
LLM Content Focus: What ChatGPT, Perplexity, and Gemini Prefer
Aug 21, 2025

Nadine
Wolff
Category:
Artificial Intelligence

Standard search engine optimization was yesterday – today it's also about designing content in such a way that it can be found, understood, and integrated into responses by Large Language Models (LLMs) like ChatGPT, Perplexity, and Google Gemini. Being cited as a source in AI-generated results not only benefits brand awareness but often also provides valuable backlinks. However, each LLM has its own focus when it comes to selecting content. In this article, you'll learn how these three models work and how you can tailor your content to their preferences. An Overview of the Three LLMs Before we delve into specific tactics, it's worth taking a quick look at how the models function. Each LLM evaluates content according to its own criteria. ChatGPT scores highly with well-structured explanations, Perplexity prioritizes recency and sources, and Gemini uses strong signals from the Google Index and prefers structured and multimedia content. These differences determine which content you should prioritize. ChatGPT – Creative & Dialogical Content ChatGPT excels at reproducing content in a natural, human-sounding language. It favors text that is easy to read, provides clear explanations, and is organized in a logical structure. Preferred Content: Storytelling, illustrative examples, step-by-step explanations Style: dialogical, accessible, understandable to a wide audience Data Source: Mainly training data, with web access in the Pro version Success Factor: Evergreen content that is mentioned on many trustworthy sites has a better chance of being included in the model Perplexity – Research, Sources, Recency Perplexity is an LLM with integrated real-time web access. The unique aspect: It always shows sources and directly links to them. Preferred Content: Recent studies, statistics, professional articles, precise analyses Style: factual, evidence-based, concise Data Source: Live internet search + structured sources Success Factor: Clear source citations, publication date, author, imprint – and content that directly addresses the question posed Extra Tip: FAQ formats and How-To guides are particularly visible because Perplexity often presents answers in a Q&A structure Google Gemini – Multimodal & SEO-Driven Gemini is closely linked with the Google ecosystem and uses traditional search data to integrate content into AI responses. Additionally, it can combine text, image, video, and audio. Preferred Content: SEO-optimized articles, rich snippets, structured data (Schema.org) Style: informative, well-organized, with visual elements such as infographics or tables Data Source: Google Search Index + multimodal analysis Success Factor: Content that already performs well in organic Google ranking has a significantly better chance of appearing in Gemini Content Priorities in Direct Comparison There are significant differences between the models. ChatGPT prefers reader-friendly explanations, Perplexity demands recency and sources, and Gemini rewards SEO structure and media diversity. Use this matrix as a guide for your editorial plan. 
| Criterion | ChatGPT | Perplexity | Google Gemini |
|---|---|---|---|
| Type of content | Explanatory texts, examples, storytelling | Professional articles, data, primary sources | SEO-structured articles, media mix |
| Recency | More evergreen | Very high | High, based on the Google Index |
| Sources | Indirect via training data | Direct, visible links | Google signals, rich results, markup |
| Format | Prose, Q&A sections | FAQ, how-to, tables, lists | H2/H3 structure, Schema.org, multimedia |
| Language | Dialogical, accessible | Factual, precise | Informative, search-intention oriented |

Optimization Strategies for Each LLM

Even though best practices overlap, focusing on the specific preferences of the models is worthwhile. This way, you can garner more mentions and links.

Optimize for ChatGPT: Begin each key section with the most important answer, followed by brief justifications and at least one example. Explain technical terms in your own words, add a concise definition, and link to additional internal pages if necessary. Structure is crucial: use clear H2 and H3 headings, frame frequent user questions as subheadings, and answer them directly in the first paragraph below. Add practical examples, checklists, and short step-by-step sequences. This increases the chance that passages will be used as complete answers.

Optimize for Perplexity: Build a clean source concept. Name primary sources, use quotes sparingly but precisely, and provide numbers with links and dates. Insert a brief summary with three to five key statements at the beginning of an article. Make the publication date, author, and company information clearly visible, and update content regularly. Maintain an FAQ block with real user questions and concise answers of 40 to 80 words, and include tables with important metrics. This increases the likelihood of being directly linked. Additionally, you can bundle in-depth resources in a resources section at the end.

Optimize for Gemini: Focus on solid on-page fundamentals. Optimize title and meta description, establish a clear heading hierarchy, and use Schema.org markup (see the markup sketch at the end of this article). Build internal links with descriptive anchor text to thematically related pages, such as guide articles or service pages. Create media that fosters understanding, such as an infographic with process steps or a table with pros and cons. Pay attention to E-E-A-T signals: an author profile with qualifications, references, and contact information builds trust.

Examples of Content Elements that LLMs Favor

- A short definition at the beginning, maximum two sentences, directly related to the question.
- An explanation section with a real-life example.
- A mini checklist with three to five points that makes a task doable.
- A table with criteria, such as a comparison of methods, costs, or risks.
- An FAQ section with three to seven real questions.

These building blocks can be used in blog posts, service pages, and knowledge articles. In online shops, they can also serve as supplementary guides on category pages.

Common Mistakes That Prevent Mentions

One of the most common mistakes is an unclear structure in which users cannot immediately find a direct answer at the beginning of a section. A lack of sources or the use of outdated data also damages credibility. If a topic is covered too broadly on a single page, relevance decreases along with the chance of being mentioned. Furthermore, a missing publication date and author leads to less trust in the content. Likewise, a lack of internal linking can mean that crucial context signals are missed, causing LLMs not to rate the content as particularly relevant.
To avoid these hurdles, you should regularly review existing content, structure it carefully, and update it specifically. Conclusion Optimization for LLMs is not futuristic – it is already crucial to remain visible in the new search world. ChatGPT prefers easily understandable, creative, and well-explained content Perplexity relies on current, evidence-based, and source-supported content Gemini accesses SEO-strong, structured, and multimedia content The requirements of ChatGPT, Perplexity, and Gemini vary – but with the right strategy, you can excel in all three models. We support you in developing content that is found, mentioned, and linked not only by search engines but also by AI systems. Get in touch now. FAQ – Frequently Asked Questions About Content Focus How do I know if my content is mentioned in LLMs? With Perplexity, it's easy – sources are linked. With ChatGPT and Gemini, you can test this through targeted queries or monitor it with tracking tools. Do I need to optimize separately for each LLM? Yes, as the models have different focuses. However, there are overlaps, e.g., with clear structure and high source quality. How often should I update content? For Perplexity and Gemini regularly, as recency is a crucial factor. Evergreen content for ChatGPT should also be maintained.
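The FAQ formats recommended above for Perplexity and Gemini become even easier for machines to interpret when they are marked up as structured data. A minimal, illustrative FAQPage snippet in JSON-LD (schema.org) could look like the following; the question and answer are taken from the FAQ above, and the snippet would be embedded in the page's HTML:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do I need to optimize separately for each LLM?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, as the models have different focuses. However, there are overlaps, e.g. with clear structure and high source quality."
    }
  }]
}
</script>
```

Many CMSs and SEO plugins can generate this kind of markup automatically; the important part is that the marked-up questions and answers also appear as visible content on the page.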
The AIO & GEO Platforms Report 2025
Aug 13, 2025

Axel
Zawierucha
Category:
Artificial Intelligence

The digital marketing world is facing its biggest upheaval since the introduction of mobile-first indexing. Artificial intelligence, particularly in the form of generative answer machines, is redefining the rules of online visibility. In this comprehensive report, we analyze the landscape of AI Tools specifically developed for this new era, and provide you with a strategic compass to not only survive in the world of Generative Engine Optimization (GEO) but to win. Critical Assessment and Classification of AI Tools A critical assessment was conducted when integrating the new tools. Tools like Superlines, Rankscale.ai, Kai, ALLMO.ai, Quno, Finseo, Scrunch, SEOMonitor, Ayzeo, LLM Pulse (Generative Pulse), Deepserp, AI Peekaboo, and Evertune were identified as relevant GEO monitoring, content, or hybrid platforms and were integrated into the corresponding sections of the report. Other mentioned tools were deliberately excluded after careful review, as they do not align with the core focus of AI visibility analysis: Behamics is an e-commerce revenue platform, Advanced Web Ranking is a traditional rank tracker without explicit GEO functions, and 'Am I on AI' tools are AI content detectors (which check if a text was written by AI, not what an AI writes about a brand). This differentiation ensures that the report exclusively focuses on the most relevant and direct solutions for Generative Engine Optimization. The Paradigm Shift in Digital Marketing: Generative Engine Optimization The emergence of Generative Engine Optimization (GEO) represents the most significant paradigm shift in digital marketing since the introduction of mobile-first indexing. This report provides a comprehensive analysis of the GEO tool market, which is predicted to reach a volume of 7.3 billion USD by 2031. It outlines the bifurcation of the market into established SEO providers (SE Ranking, Semrush) and specialized startups (Profound, Otterly.ai), evaluates their capabilities, and provides a strategic framework for implementation. The key insight is that visibility in AI-generated answers is no longer optional; it is a critical, measurable, and optimizable component of modern brand strategy. Understanding the New Search Paradigm – Generative Engine Optimization (GEO) This section provides the strategic context by defining the transition from traditional SEO to optimization for AI-driven answer machines. It familiarizes readers with the new terminology, principles, and technical requirements necessary to compete in this evolving landscape. Defining the Post-SEO Landscape: From Search Engines to Answer Engines The fundamental shift in digital search behavior is transitioning from a list of links (Search Engine Results Pages, SERPs) to synthesized, conversational answers provided by generative AI models. This development fundamentally changes the customer journey and optimization goals. While traditional search engine optimization (SEO) focused on achieving clicks, Generative Engine Optimization (GEO) aims to receive citations in AI answers and influence the portrayal of one's brand within these answers. The current market landscape is characterized by a myriad of overlapping terms. For the clarity of this report, the following working definitions are established: AIO (Artificial Intelligence Optimization): This is the broadest term, often referring to making content machine-readable. AEO (Answer Engine Optimization): A more specific term that focuses on structuring content to answer direct questions. 
This targets featured snippets, "People Also Ask" boxes (PAA), and voice search. GEO (Generative Engine Optimization): This is the most current and relevant term. It encompasses the holistic practice of optimizing content and brand signals to appear in AI-generated answers on platforms like ChatGPT, Perplexity, and Google AI Overviews. This report uses GEO as the primary overarching term.

This shift is not just theoretical; the data confirms the urgency and importance of the topic. As of March 2025, 13% of all Google searches already triggered an AI Overview – a 72% increase over the previous month. Moreover, Gartner predicts that the volume of traditional search engine usage will decrease by 25% by 2026 and by 50% or more by 2028, as users increasingly switch to AI assistants. The coexistence of multiple competing acronyms for a similar concept is a classic sign of an emerging, rapidly evolving market. This is not a marketing failure but rather evidence that the practice of AI optimization is solidifying faster than the industry can agree on a unified name.

Core Principles of GEO: A Strategic Framework for AI Visibility

The formalization of GEO as a concept in academic research provides a rigorous theoretical foundation. One of the key insights is that incorporating citations, quotations, and statistics can increase the visibility of a source in AI answers by more than 40%. Google's E-E-A-T principles (Experience, Expertise, Authoritativeness, Trustworthiness) are of paramount importance for GEO, since AI models are explicitly designed to prioritize credible sources. GEO also requires a shift from isolated keywords to building thematic authority around entities (people, products, concepts). A critical tactic is obtaining unlinked brand mentions (co-citations) in authoritative content.

| Metric | Traditional SEO | Generative Engine Optimization (GEO) |
|---|---|---|
| Primary objective | Ranking on the SERP | Being cited in the AI answer |
| Core unit of optimization | Website | Brand/entity |
| Key tactics | Keyword optimization, backlinking | Semantic structuring, E-E-A-T signals, co-citations |
| Primary KPIs | Organic traffic, keyword rankings | Share of voice, mention frequency, sentiment |
| Content focus | Long-form articles | Snippet-ready, structured answers |
| Authority signals | Domain authority, backlinks | Expert citations, data quotes, reviews |

The Technical Foundation: The Critical Role of AI-Friendly Schema and llms.txt

Schema markup is the essential infrastructure that makes content readable for AI systems. It provides explicit context and helps AI differentiate facts from filler. Best practices for AI-visible schema:

- Use JSON-LD: the format preferred by Google.
- Prioritize key schema types: Organization, Product, FAQPage, HowTo, and Article are particularly effective.
- Mark up visible, real content: do not add schema for invisible content.
- Completeness and accuracy: fewer but complete properties are better than many incomplete ones.

The llms.txt file is emerging as a new standard – similar to robots.txt – to provide LLMs with clear guidelines on using website content. It can easily be created with free online tools or WordPress plugins like AIOSEO. The robots.txt file, on the other hand, should be set up by experienced SEOs, as even small errors could, in the worst case, result in LLMs being completely excluded from access.
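To make these recommendations concrete, here are two small sketches: first, a minimal Organization snippet in JSON-LD, and second, an llms.txt file following the emerging convention of a title, a short summary, and curated link lists. All names and URLs are placeholders, not real recommendations.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example GmbH",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": ["https://www.linkedin.com/company/example"]
}
</script>
```

```text
# Example GmbH

> Example GmbH is an online marketing agency for SEO, SEA and web analytics.

## Guides
- [GEO basics](https://www.example.com/blog/geo-basics): Introduction to Generative Engine Optimization
- [Schema markup guide](https://www.example.com/blog/schema-guide): How we implement structured data

## Company
- [Services](https://www.example.com/services): Overview of our services
```

As with robots.txt, the file would be served at the root of the domain (e.g., /llms.txt).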
Market Analysis and Future Outlook

This section offers a macro perspective on the GEO market, analyzing its size, growth drivers, and future development.

Market Landscape: Sizing the GEO Opportunity and Growth Forecasts

The global market for Generative Engine Optimization (GEO) services was valued at 886 million USD in 2024 and is expected to grow to 7.318 billion USD by 2031, at a compound annual growth rate (CAGR) of 34.0%. This growth is driven by the rapid adoption of AI-powered search by users. The discrepancy between the growth rates of the GEO market (34.0% CAGR) and the traditional AI SEO tools market (12.6% CAGR) signals market disruption. Budgets will likely be reallocated from traditional channels, and those not investing in GEO risk the erosion of their existing search visibility.

Investments & Innovation: A Look at the GEO Startup Ecosystem

The high growth potential has attracted significant venture capital and led to the emergence of specialized startups like Profound, Otterly.ai, and BrandBeacon. These companies are designed from the ground up for GEO and are driving innovations in areas critical for AI Search Monitoring and AI search tracking, such as real-time brand monitoring in LLMs and sentiment analysis of AI answers.

The Future of Digital Discovery: Expert Perspectives

Experts agree: the change is irreversible. One of the main challenges is measuring GEO success. Traditional metrics are losing relevance, while new KPIs like AI Search Visibility, share of voice, and citation frequency are becoming established. LLMs provide "opinions, not lists" – if a brand is not among the first mentions, it is practically invisible.

Comparative Analysis of AIO/GEO Visibility Platforms

This is the core of the report: a detailed, feature-based comparison of the key AI tools on the market.

Evaluation Framework: Key Metrics and Capabilities

To evaluate the tools fairly, we defined a framework with the following criteria:

- LLM & platform coverage: Which AI engines are monitored?
- Core visibility metrics: What is measured (e.g., share of voice, sentiment)?
- Competitive analysis: How well are competitors tracked?
- Data & analytics capabilities: How is the data processed?
- Action orientation & workflow: Does the tool assist in execution?
- User-friendliness & target audience: Who is it designed for?
- Pricing & value: What is the cost structure?

The Established: How SEO Suites Adapt to the AI Era

These players leverage their existing infrastructure to enter the GEO market.

- SE Ranking AI Visibility Tracker: an all-in-one platform that combines traditional SEO and GEO. Ideal for SEO professionals and agencies looking for an integrated solution.
- Semrush AIO: an enterprise solution focused on large-scale benchmarking and unmatched data depth.
- SEOMonitor: specifically developed for agencies to optimize workflows with AI-powered tools.

The Challengers: A Deep Dive into Dedicated GEO Monitoring Startups

This category represents the "pure" GEO platforms, which are often more innovative and agile.

- Profound: a premium solution for businesses with real-time insights and advanced features like the "Conversation Explorer".
- Otterly.ai: an Austrian startup with a strong focus on brand safety and risk management.
- Peec AI: a specialized platform for global businesses with multilingual and cross-country support.
- Rankscale.ai: offers an intuitive user interface and AI-generated suggestions for content optimization at the URL level.
- Scrunch: focuses on optimizing the AI customer journey, including journey mapping and persona-based prompting.
- ... and many more, detailed in the comparison table.
The Big Comparison Table of GEO Tools

| Tool | Strategic Focus | Covered LLMs | Key Metrics | Pricing Model | Ideal User Profile |
|---|---|---|---|---|---|
| SE Ranking | Integrated SEO + GEO | Google AIO, ChatGPT, Perplexity, Gemini | Mentions, links, SoV | Subscription (part of SEO plans) | SEO professionals, agencies, SMEs |
| Semrush AIO | Enterprise monitoring | Google AIO, ChatGPT, Claude, Perplexity, Gemini | Mentions, sentiment | Subscription (enterprise focus) | Large enterprises, e-commerce brands |
| SEOMonitor | Agency workflow automation | Google AIO, ChatGPT, Gemini | AIO visibility, GEO tracking | Subscription (from €99/month) | SEO and digital marketing agencies |
| Profound | Enterprise GEO intelligence | ChatGPT, Perplexity, Gemini, Copilot, Claude | Mentions, citations, SoV, sentiment | Premium subscription ($499+) | Enterprise brands, data-driven agencies |
| Otterly.ai | SME brand safety | ChatGPT, Perplexity, Google AIO | Rankings, citations, brand safety warnings | Tiered subscription ($29+) | PR teams, brands in sensitive industries |
| Peec AI | Global GEO analysis | ChatGPT, Perplexity, Gemini, Claude, Grok | Position score, sentiment | Tiered subscription (€90+) | International corporations, global agencies |
| Rankscale.ai | Actionable GEO analysis | ChatGPT, AI Overviews, Perplexity, etc. | Rankings, citations, sentiment | Affordable subscription (from €20/month) | SEOs seeking quick insights |
| Scrunch | AI customer journey optimization | Leading LLMs (incl. Grok, Claude) | Sentiment, competitive position | Unknown | Agencies, enterprise brands |
| Deepserp | Technical GEO audit | ChatGPT, Gemini, etc. | AI crawl behavior, citations | Subscription (from $99/month) | Large websites, technical SEO teams |
| LLMrefs | Freemium visibility | Key LLMs | LLMrefs score, mentions | Freemium ($0 / $79) | Freelancers, small businesses |

The Specialists: Niche, Integrated, and Hybrid Platforms

This category includes tools that have integrated GEO/AEO functionalities into their core offerings.

- Wix AI Visibility Overview: the first major CMS with an integrated tool for tracking AI visibility – an extremely convenient solution for millions of Wix users.
- Content & on-page optimization platforms (Rankability, Surfer SEO, etc.): this group focuses on creating content that is structured and semantically rich enough to be cited by AI.
- PR-focused platforms (LLM Pulse): these solutions highlight which media and sources influence a brand's representation in LLMs.

Strategic Implementation and Recommendations

This final section translates the analysis into an actionable strategy.

Choosing the Right GEO Platform: A Needs-Based Decision Matrix

Selecting the right tool depends on your specific goals.

| User Profile | Primary Goal | Top Recommendation(s) | Alternatives |
|---|---|---|---|
| Enterprise brand manager | Comprehensive brand monitoring | Profound | Semrush AIO, Peec AI |
| SEO agency | Scalable client management | SE Ranking | SEOMonitor, Semrush |
| SME/startup owner | Cost-effective visibility tracking | Otterly.ai | Rankscale.ai, LLMrefs |
| Content marketer/strategist | Creating AI-optimized content | Rankability | Surfer SEO, Finseo |
| Technical SEO | Monitoring AI crawling capabilities | Deepserp | ALLMO.ai |

Building a GEO-Centered Content Strategy: From Audit to Execution

Step 1: Define requirements & test tools: set your goals and test a shortlist of tools.
Step 2: Conduct a baseline audit: use a tool to measure your current AI visibility and identify gaps.
Step 3: Integrate analytics: connect GEO data with web analytics (e.g., GA4) to measure ROI (a small monitoring sketch follows at the end of this article).
Step 4: Implement the technical foundations: create AI-friendly schema and an llms.txt file.
Step 5: Execute the content strategy: create structured, authoritative content that directly answers user queries.
Step 6: Monitor, iterate, and report: continuously track performance and refine your strategy.

Concluding Analysis: Mastering Visibility on the AI Search Front

The synthesis of the findings shows: the GEO tool market is dynamic and bifurcated, yet the underlying principles focus on E-E-A-T and structured data. The shift from search engines to answer engines is irreversible, making investments in this area a strategic necessity. The most successful approach will be a hybrid one: combining the in-depth monitoring features of specialized AI tools with the optimization features of AEO-focused platforms. The winners in the next era of digital marketing will be those who master the art and science of being the most credible, citable, and machine-readable source of information in their field.

Ready for the New Search Reality?

Take advantage of the first-mover advantage in Generative Engine Optimization. We support you in making your brand visible in AI answers – with a well-founded GEO strategy, tool setup, and content optimization. Talk to our experts and secure your AI visibility of tomorrow!
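As a supplement to Step 3 and Step 6, it helps to check regularly how much referral traffic already arrives from AI answer engines. The following Python sketch is a hypothetical illustration: it assumes you have exported sessions with a referrer field (for example from your web analytics tool or server logs), and the list of AI domains is illustrative rather than exhaustive.

```python
# Hypothetical sketch: count sessions referred by AI answer engines.
from collections import Counter
from urllib.parse import urlparse

AI_REFERRERS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "www.perplexity.ai": "Perplexity",
    "gemini.google.com": "Gemini",
    "copilot.microsoft.com": "Copilot",
}

def classify(referrer: str) -> str:
    """Map a referrer URL to an AI engine name, or 'Other'."""
    host = urlparse(referrer).netloc.lower()
    return AI_REFERRERS.get(host, "Other")

# Placeholder export: in practice this would come from your analytics data.
sessions = [
    {"referrer": "https://chatgpt.com/", "landing_page": "/blog/geo-guide"},
    {"referrer": "https://www.google.com/", "landing_page": "/"},
]

counts = Counter(classify(s["referrer"]) for s in sessions)
print(counts)  # e.g. Counter({'ChatGPT': 1, 'Other': 1})
```

Tracked over time, such a simple breakdown shows whether your GEO measures translate into measurable visits.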
Find SEO Keywords and Develop a Keyword Strategy
Jul 30, 2025

Julien
Moritz
Category:
SEO

Keywords have been an important foundation of search engine optimization from the beginning. But their role has changed, just as the way we use them has, not least due to the increasing dominance of artificial intelligence (AI). We are convinced that keywords are still very important, so this guide will teach you how to find keywords, what to look for when choosing keywords, which tools you can use, and how to optimally use keywords in SEO. What are keywords? In search engine optimization, we refer to keywords or key phrases as the terms or phrases that users enter into search engines in order to find answers, information, content, or products. We use these terms on websites in certain elements to increase the probability of achieving good positions in search results. In SEA (e.g., Google Paid as opposed to SEO = Google Organic), we bid on keywords to display ads for those terms in search results. To make the topic more illustrative, we explain all the points in this blog post using a specific example: a fictional bicycle online shop or bicycle store with a website, namely the topic "bicycle" or "buying a bicycle". The graphic shows a small selection of different relevant related keywords for this topic: Types of keywords We differentiate between different types of keywords. This distinction plays a role in the strategic direction of our content and the priority we give to those keywords. By length: Short Head & Long Tail Keywords In general, there are two different types of keywords that are defined by their length: Short Head Keywords are short, very general terms that generally have a very high search volume but also correspondingly high competition. The intention behind these keywords is not clear. For example: "bicycle" is a Short Head keyword, which can imply a search for information, such as (what types of bicycles are there?), as well as a purchase intention or even a search for images. Long Tail Keywords , on the other hand, are longer, specific phrases or questions. Depending on the topic, the search volume and competition behind them are significantly lower, and the intention is generally clearer. For example: "best e-bikes 2025" or "buy cheap used kids’ bicycle" or "how to patch a bicycle tire?" In addition, you can define " Mid Tail Keywords " that lie between these two types of keywords. By intention: Information, Navigation, or Transaction? Another important classification of keywords is the intention behind the search query: Informational keywords indicate a search for information. These can be the beginning of the user journey, the first step on the way to a purchase. For example: "trekking bike vs. city bike" or "what to consider when buying a bicycle?" Navigation Keywords point to the search for a specific brand, website, or product. For example: "Decathlon bicycles" or "Cube E-bikes". Transactional keywords show a purchase intention. Users are looking for an online shop or a local buying opportunity. For example: "buy trekking bike" or "order gravel bikes". In addition to these classifications, there are other characteristics of keywords, such as search volume (how often is the term searched per month?) or the "Keyword Difficulty" calculated by many SEO tools (the difficulty of ranking in top positions for this keyword in search results). Why are keywords important in search engine optimization? Keywords reveal which terms (and topics) are searched frequently. To improve visibility in search results for relevant keywords, we use these terms on the respective pages. 
This way, we show search engines that a page (URL) is relevant for a specific topic and the associated terms. Therefore, we research and analyze keywords as the basis of content optimization - to know how and what to optimize. What is the significance of keywords in the age of AI? Are keywords still important in the age of AI? Let’s take a look back and take a short journey through SEO history. The role and significance of keywords have constantly evolved over the last few decades: While about 20 years ago keywords had to be used in their exact form to rank well, Google's language understanding developed over the following years. Various grammatical forms, singular and plural, were recognized as identical terms. Similarly, the connection between synonyms and related terms was recognized - it became about semantics instead of exact keyword matching. We have not been thinking in pure keywords in search engine optimization for many years. Instead, we optimize on topic clusters, naturally supported by keywords. Keywords are still used as the basis, but the context behind them has become significantly more important. A sign of this is that Google often displays websites optimized for a synonym for search queries - Google recognizes that terms mean the same thing. In addition to the clusters, another term has become important: the entity. An entity is a uniquely identifiable object or concept, such as a person, a place, a process. An indication of what Google identifies as an entity includes, among other things, the Knowledge Graph in search results or suggestions for topics or concepts. This can be seen in the example "pedelec": These connections between topics or terms also play a role in AI SEO . Keywords are far from dead; they still serve an important purpose even in the AI age: Keyword analysis helps you understand the terms your (potential) customers use - and to speak the same language as your target audience. Keywords, especially “long tail keywords” like questions, help you better understand the intentions and problems of your target audience and offer corresponding content. Keywords still underpin entities and topic clusters, only they are not used as strictly as they were years ago. It is more important to cover the topic comprehensively and satisfyingly for customers, to meet the intent, and to present yourself as an authority to users and search engines. Keyword analysis - Finding keywords in 3 steps How do we find keywords for our keyword and content strategy? Depending on resources and available time as well as the topic, a keyword analysis can be a complex task. The more general the topic, the more extensive the analysis and the more subtopics and keyword clusters are found. It is easier to research keywords for a single page and a very specific topic. 1. Brainstorming Ideally, you are well-versed in your topic for which you want to find keywords. The first step is always a rough brainstorming session where you jot down all subtopics, terms, and questions related to your topic. With this, you have the first foundation for your analysis on which you can build. If you want to research keywords for an entire website, think of these terms, for example: Your brand name (depending on its popularity, there may be a relevant search volume behind a brand name) Your industry (e.g. bicycle online shop) Categories (e.g. e-bikes, city bikes, mountain bikes, road bikes, kids' bikes, bikes for seniors, accessories, etc.) Brands (e.g. 
Cube Ebike, Bergamont Bicycle) Products or services (specific product names of bestsellers, e.g. cube kathmandu hybrid pro 750) Frequently asked questions about products or categories (e.g. which bike for commuting) 2. Analyzing keywords on the website If you already have a website or an online shop, you should establish a status quo: for which terms are you already (well) found, and through which terms do you get clicks and impressions? To do this, analyze the terms you find in the performance report in Google Search Console, and for which you get many impressions or clicks and add them to your brainstorming list. If you have access to a professional SEO tool, you will find all keywords for which your domain has a position in the top 100 here. You can filter these keywords by search volume or positions and export this list. This way, you find out where there is already potential on which you can build optimization. Particularly interesting here are the so-called "threshold keywords" - keywords that are "on the verge" of a certain area. Usually, this refers to - depending on the definition - keywords between 11 and 20, which are just before the top 10, or keywords starting from position 4 or 6, which are on the verge of the top 5 or top 3. 3. Keyword research After you probably already have a long list of keywords, it's time to research additional terms. A wide variety of sources and tools are available for this. If you want to conduct particularly comprehensive research, you can use all tools, but you can also focus on just a few if you have identified a sufficient number of keywords through these. Tools for keyword research The bad news first: Good keyword tools that you can work with effectively are always chargeable. Free versions generally limit the number of results or queries per day. You have to enter each keyword individually, instead of analyzing a list of terms, and the filter and export options are mostly restricted or unavailable. Professional SEO tools usually come with a keyword tool where you get important information for your entered terms, such as monthly search volume, as well as related terms that you can cluster and further analyze. Some of the most well-known tools include: Semrush, Ahrefs, Sistrix, Surfer SEO, or the KWfinder from Mangools. Free, limited options are available, for example, through Ubersuggest or Answerthepublic. Both tools offer a paid version as well as free research options. A unique feature of answerthepublic is the focus on long tail keywords. If you run ads on Google Ads, you can use the Google Keyword Planner, which gives you search volume, CPC, and competition for each keyword. Further sources to find keywords Besides the classic keyword tools, there are a whole series of other sources available to find keywords. These include: Google Suggest: Terms suggested by Google when entering terms in the search box Google related terms under search results Google Trends: Development of interest over time, but only for frequently searched terms Questions in Google search results Social Media Hashtags on relevant platforms such as Instagram, TikTok, Pinterest, etc. 
- YouTube Autosuggest: suggested terms when entering a word in the YouTube search
- Topics on platforms with user-generated content, such as Reddit
- A thesaurus like Woxikon

Keyword Gaps

A very valuable analysis is the so-called "Keyword Gap Analysis", where you compare your website or online shop with one or more relevant competitors. Here, not only direct competitors are relevant (in our example, other bicycle online shops), but all websites that are present for your topic in the search results (for example, an online magazine about bicycles). Using a professional keyword tool, the domains can be compared, and you receive a list of keywords for which your competitors already rank in the search results but your domain does not yet. From this, opportunities arise to expand the website or shop, for example with new subcategories or new blog posts.

Create a keyword strategy

What do you do now with the many keywords you have found through the various sources? The first step is always to cluster these keywords thematically to get a better overview. These clusters can, for example, be oriented along the structure of your online shop and the type of page:

- Homepage keywords (most general terms and brand, e.g. bicycle online shop)
- Categories (e.g. mountain bikes, road bikes, kids' bikes, etc.)
- Subcategories (e.g. 16-inch bikes, dirt bikes, folding bikes, etc.)
- Product keywords (e.g. Cube Agree C62, Bergamont Sponsor Tour S, etc.)
- Guide keywords (e.g. what to consider when buying a bike, what is a trekking bike, adjusting hub gears, etc.)
- FAQ keywords (e.g. what frame size for a bike, how long does an e-bike battery last, etc.)

What makes a good keyword?

Out of the large number of keywords you have researched and clustered, you will not be able to use all of them. Therefore, it is important to evaluate and prioritize these keywords. Various criteria matter here, and prioritization also depends on your strategic goals and your industry:

- Relevance: all keywords must be relevant to your business (if you don't sell road bikes, the keyword "buy road bike" is irrelevant).
- Search volume: a higher search volume means a higher probability of generating traffic. However, for niche industries, search volume is not the decisive factor. A more suitable keyword with a lower search volume can be more valuable than a more general one with high search volume. The goal is to find the optimal balance.
- The intention behind the keyword: the intent behind the search query must always match what users find on the page. Keywords indicating purchase intent (e.g., "buy mountain bike") are less important for a pure online magazine than informational keywords.
- Conversion probability: for an online shop, keywords with a relation to purchases are particularly valuable, as they are more likely to lead to a conversion than informational keywords. The more specific the term (for example, a product name), the higher the chance that someone will buy. Keywords for blog posts are usually not judged primarily by this criterion.
- The competitive situation: keywords with high competition (for which many large, long-established domains with built-up trust and authority rank) can be too big a challenge for new domains or smaller websites at first.
It can take years to rank well for highly competitive keywords. Instead, keywords with less competition can achieve quicker results. Possible existing rankings: It is generally easier to improve keywords for which there is already a ranking, than to be found for brand new keywords. This is particularly true for keywords in threshold positions. Summary: a good keyword Has high relevance for your business Has relevant search volume Covers the appropriate intention Has a high conversion probability Has low to medium competition May already have existing rankings that can be improved. There is no blueprint for assessing and prioritizing keywords and creating a keyword strategy that can be applied to every industry and company. It requires both experience and a general company and marketing strategy to which the keyword strategy is adapted. We can therefore only give you suggestions at this point but are happy to support you in creating a professional keyword strategy! Keyword Mapping The most important step in the keyword strategy is creating a so-called keyword mapping. This involves assigning keywords to target pages (either existing pages or newly planned pages). It’s important: For each important keyword, exactly one page is defined that should rank for it Each important page is assigned a keyword set of 1-2 main keywords and several secondary keywords This setup prevents multiple pages from competing for the same keyword (the result can be that neither page ranks if Google is unsure which is the more relevant page - the so-called “cannibalization”). You can also identify which important keywords still lack a suitable target page. Based on this keyword mapping, you can then plan and optimize your content. Content Strategy Once we have assigned the keywords to the appropriate, existing, or new pages, we can create an editorial plan from them, in which all topics are included, for which we need to optimize pages or create new texts. For example, with bicycles, this could include these texts and pages: Children’s bicycles : the category page for children is supplemented with a guide, a product comparison, and buying advice. Buy bicycle under 500 dollars : either a category page or a filter page with the appropriate criteria or a blog post with a product comparison Best bike for beginners : blog post with a product comparison and buying advice Buy a bike with hub gears : the appropriate category page is supplemented with relevant content (comparison of hub gears and derailleur gears, advantages of hub gears, most popular bikes with hub gears, etc.) What to consider when buying a bicycle : guide article for the blog with an overview of different types of bicycles, criteria for purchase (usage, size, features, etc.), online purchase vs. local purchase, etc. Buy bicycle in #city : landing page of the own local shop with an overview of the assortment, address, maps, opening hours, etc. Using keywords The use of keywords could fill another long article, so we would like to give you an overview of the most important elements on a website or in an online shop where you should use keywords: In the metadata (page title/title tag and meta description/page description) In the headlines In the text In filenames of images and image descriptions In internal links Caution: Avoid keyword stuffing! One of the biggest mistakes in using your researched keywords is repeating the same keywords too often. 
If your text sounds unnatural because you've stuffed it with too many identical or similar terms, no one will want to read it, and search engines will likely evaluate it as "keyword stuffing". Always write content for users: use synonyms and related terms, and avoid pure keyword lists or grammatically incorrect phrasing just to match a keyword exactly. The focus is on providing relevant, helpful content for your target audience while using relevant keywords naturally!

SEO vs. SEA keywords

Effective SEO keyword strategies also benefit paid channels. Many synergies can be created between SEO and SEA (especially Google Ads), for example by using the same keywords for organic optimization and paid ads and thus occupying two positions in the search results with your website: one paid and one unpaid. A keyword and content strategy developed for search engine optimization can therefore also be used in SEA. SEA, in turn, is well suited for A/B testing ad texts and CTAs (calls to action), and the winning variants can then be reused in the metadata (page title and meta description). Optimizing page content for the most searched keywords also helps SEA, for example by improving the Quality Score and lowering click prices. To gain the greatest benefit, the insights from both channels should be combined, which is why a company benefits greatly when SEO and SEA are managed by one team. Incidentally, at internetwarriors we have long-standing SEO and SEA expertise in one team!

A tailor-made keyword & content strategy for you

Finding the right keywords and deriving an effective strategy that matches your business goals can be a challenge. The SEO experts at internetwarriors are happy to assist you. We analyze your website, your industry, and your potential, research the appropriate keywords, and create an individual content strategy for you. We are also happy to help you optimize existing content or create new content, or to train you and your team in using keywords effectively - even in the age of AI.
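The keyword mapping described above can also be checked automatically for cannibalization, i.e. the same main keyword assigned to more than one URL. The following Python sketch is a hypothetical example using the fictional bicycle shop from this article; the URLs and keywords are made up.

```python
# Hypothetical sketch: detect cannibalization in a keyword mapping,
# i.e. the same main keyword assigned to more than one target URL.
from collections import defaultdict

keyword_mapping = [
    {"url": "/mountain-bikes/", "main_keyword": "buy mountain bike"},
    {"url": "/blog/mountain-bike-guide/", "main_keyword": "what to consider when buying a mountain bike"},
    {"url": "/sale/mountain-bikes/", "main_keyword": "buy mountain bike"},  # conflict!
]

urls_per_keyword = defaultdict(list)
for row in keyword_mapping:
    urls_per_keyword[row["main_keyword"]].append(row["url"])

for keyword, urls in urls_per_keyword.items():
    if len(urls) > 1:
        print(f"Possible cannibalization for '{keyword}': {', '.join(urls)}")
```

In practice, the mapping would come from your keyword strategy spreadsheet; the check itself stays the same.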
Server-Side Tracking - An Overview
Jul 16, 2025

Halid
Osmaev
Category:
Web Analytics

Server Side Tracking is the new standard. A significant advantage is the control it provides over the data flow, especially over user data. In this article, we discuss Server Side Tracking using the Google Tag Manager as an example, review its benefits, and look at which user data is sent. But first, the important question:

What is Server Side Tracking?

In short: Server Side Tracking is a data collection method in which tracking information is processed not in the browser but directly on the website operator's server and only then forwarded to analysis or marketing tools.

The traditional tracking method is Client Side Tracking (CST), where a code snippet is embedded in the page, for example via the Google Tag Manager. This snippet sends event data directly to third-party services like Google Analytics 4, Meta Ads, etc. However, control over the user data that is sent (IP address, demographic data, etc.) is limited to the adjustments offered by the respective tool. Additionally, a third-party cookie is usually set, resulting in a loss of data volume and quality.

[Figure 1: Comparison of client-side and server-side tagging]

With Server Side Tracking (SST), all data is first sent to a private server on which, for example, the server-side Google Tag Manager is running. This ensures that no unwanted data transfer to third parties takes place directly in the user's browser; the transfer happens only in the server-side Google Tag Manager, where it can be adapted to a privacy-compliant standard thanks to clear insight into the data and additional configuration options such as transformations.

Server Side Tracking vs. Client Side Tracking

The traditional Client Side Tracking (CST) is still widespread but is increasingly reaching its limits. In CST, tracking scripts are executed directly in the user's browser, sending data like page views, clicks, or conversions to third-party tools. This approach is very susceptible to modern tracking protection measures such as ad blockers, VPNs, Intelligent Tracking Prevention (ITP) in iOS/Safari, and various data protection regulations.

In contrast, Server Side Tracking (SST) uses a different approach: tracking data is no longer sent directly from the browser to external tools but first to your own server. This server acts as a proxy or central data hub through which all tracking requests run. The server request is treated similarly to an API request and is thus less vulnerable to blocking. Additionally, all data processing takes place within your own infrastructure, significantly reducing the risk when dealing with data protection authorities. Another difference lies in the use of cookies: while Client Side Tracking relies on third-party cookies – which are increasingly blocked by browsers – Server Side Tracking prefers first-party cookies, which are considered more trustworthy and stable.

Why is Server Side Tracking now standard?

While Client Side Tracking is increasingly losing its effectiveness due to growing restrictions, Server Side Tracking offers a future-proof, high-performance, and privacy-friendly alternative – with significantly higher data quality and control for companies. An overview of the benefits of Server Side Tracking: More data control: unlike with the specifications of external tracking tags, companies with SST retain full control over the collected data. Higher data quality: SST can often bypass ad blockers and tracking protection measures, typically leading to at least 12% more data.
Performance advantages : Instead of addressing many individual tracking tools directly from the browser, only one server is contacted – conserving resources and improving website load time. Data protection compliance : By processing exclusively within their server structure, companies can better respond to legal requirements. Server Side Tracking and Data Protection Regulations Server Side Tracking offers not only technical advantages but also a significantly better basis concerning data protection laws. The main legal regulations in the European area are the GDPR, the TTDSG, and the EU-USA Data Privacy Framework. An overview: General Data Protection Regulation (GDPR) The General Data Protection Regulation ( GDPR ) mandates that personal data – which can be traced back to a real person, such as name, email address, or IP address – may only be collected and processed with the explicit consent of the users (e.g., through a cookie banner). It has been applicable in all EU member states since May 25, 2018, forming the central legal framework for handling personal data in the European area. The GDPR requires companies to inform transparently which data is collected for what purpose and how long it will be stored. Additionally, users must be able to object to processing or revoke consent at any time. For tracking, this means: No data may be collected or shared with third parties without clear and voluntary consent – even if the technology allows it. Violations of the GDPR can result in hefty fines. Server Side Tracking offers the advantage that data collection, storage, and sharing can be centrally controlled and better documented – facilitating GDPR-compliant implementation. Telecommunications-Telemedia Data Protection Act (TTDSG) The TTDSG (Telecommunications-Telemedia Data Protection Act) supplements the GDPR specifically for online services and stipulates that no arbitrary user data , especially through cookies or similar technologies, may be stored or read without prior consent . The law came into force on December 1, 2021, merging central data protection requirements from the GDPR and the German Telemedia Act (TMG) as well as the Telecommunications Act (TKG). For online tracking, this means: Even setting a cookie that is not purely technically necessary requires active, informed consent from users, for example, through a consent banner. Tracking methods attempting to create user profiles without consent – even through technologies like fingerprinting – are prohibited under the TTDSG. This tightens the requirements for data-driven online marketing measures and underscores the necessity to make tracking privacy-compliant and transparent – something that is much better controlled with Server Side Tracking. EU-USA Data Privacy Framework Particularly relevant for international companies is the new EU-USA Data Privacy Framework , which facilitates transatlantic data transfer and has been in effect since summer 2023. Previously, it was problematic to send personal data to US services because US authorities had extensive access to it by law. The new agreement creates more legal certainty when US services like Google or Meta are used – but only if the services are certified under the new framework. These are just a few of the laws affecting tracking. Therefore, an understanding of user data is important. Conclusion: Why does Server Side Tracking offer more data protection compliance? Server Side Tracking allows the entire data processing to initially run through one's server infrastructure. 
This means: tracking no longer takes place uncontrolled in the users' browsers but only after explicit consent and under full control of the data processing on your own server. This allows the requirements of data protection laws to be implemented more effectively, for example through targeted anonymization, pseudonymization, or restriction of data sharing with third parties. Overall, Server Side Tracking enables a more privacy-compliant handling of user data and lets companies maintain oversight and control – which is essential under the current regulatory framework.

What user data is sent with Server Side Tracking?

The good news: only the absolute minimum. What does this mean? Using the Google Tag Manager as an example: when an event on the page, such as a click, is triggered, an HTTP request is sent to the server-side Google Tag Manager. As with any HTTP request, header information is sent along. This includes, among others:

- Time
- IP address
- Page URL
- Approximate location (derived from the IP address)
- Operating system
- Browser
- Resolution
- Device

Additionally, there are other parameters specifically related to the configuration. Detailed information can be found in the HTTP header documentation at https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers. There are also parameters automatically captured by the Google Tag for campaign optimization, which include:

- utm_source
- utm_medium
- utm_campaign
- utm_content
- utm_term
- and the Click ID

You should also check which additional, user-defined data is sent along through your Google Tag Manager configuration. In the server-side Google Tag Manager, you can configure precisely – for example via transformations – which data is forwarded in what form and which is withheld. For a data-secure implementation, the guiding principle should be: "Track only as much data as needed." The challenge is to limit tracking to what is necessary without incurring disadvantages.

We set up correct Server Side Tracking for you

The Internetwarriors are a team of experts in various fields of online marketing. One of our main focuses is web analytics and Server Side Tracking (SST). With extensive expertise and a profound understanding of the latest trends and technologies in digital analytics, we offer tailored solutions to optimize our clients' online presence. We are therefore a valuable partner when you want to set up professional tracking that provides all the data you need for strategic decisions and for monitoring your online marketing activities. Contact us now without obligation!
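To close with a technical illustration of the principle "track only as much data as needed": the following Python sketch shows the idea of a first-party collection endpoint that drops or truncates personal data before forwarding an event. It is a conceptual example, not the server-side Google Tag Manager itself; the endpoint URL and field names are placeholders.

```python
# Conceptual sketch of a first-party collection endpoint that forwards only
# the data that is actually needed. ANALYTICS_ENDPOINT is a placeholder.
from flask import Flask, request
import requests

app = Flask(__name__)
ANALYTICS_ENDPOINT = "https://analytics.example.com/collect"  # placeholder

def truncate_ip(ip: str) -> str:
    """Coarsen an IPv4 address by zeroing the last octet (pseudonymization)."""
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"]) if len(parts) == 4 else ""

@app.post("/collect")
def collect():
    event = request.get_json(force=True) or {}
    forwarded = {
        "event_name": event.get("event_name"),
        "page_url": event.get("page_url"),
        "ip_prefix": truncate_ip(request.remote_addr or ""),
        # Deliberately NOT forwarded: full IP, user agent, cookies, etc.
    }
    requests.post(ANALYTICS_ENDPOINT, json=forwarded, timeout=5)
    return "", 204
```

In a real setup, the server-side Google Tag Manager takes on exactly this role, and transformations provide comparable controls without custom code.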
Why Usability Is So Important for Your Website
Jun 11, 2025

Nadine
Wolff
Category:
SEO

Website usability means that users can reach their goals without frustration, overthinking, or detours – whether their goal is to get information, make a purchase, download something, or get in touch. The easier and smoother your website is to use, the more likely visitors are to stay – and come back. Good usability not only saves time but also builds trust and improves the perception of your company or brand. Especially in the digital market, just a few seconds often determine whether someone stays or leaves. It is therefore crucial that your website is logically structured, easy to understand, and technically impeccable – for all users, on every device. In this article, we provide an overview of the fundamentals of website usability, explain why it is so important, and outline which aspects you should consider.

What is Usability?

Usability can be translated as "fitness for use" and describes how easily users can achieve the desired result. The term "user-friendliness" is also often used. Optimal usability means that a product, service, or website fulfills its intended purpose precisely. On a website, this could mean a completed purchase, the provision of relevant information, strengthening your brand, etc., depending on the intention. Usability is a key factor in determining whether and to what extent visitors engage with your website content or move on to another site. It is an important factor in customer satisfaction and website quality. A usability analysis can help identify errors, weaknesses, and opportunities for improvement on the website.

What does Usability encompass?

Website usability includes all aspects that affect the user-friendliness of a website – how simple, understandable, efficient, and pleasant it is to use. Many factors interact to create a positive user experience. Specifically, they include:

- Navigation and structure: a clear page layout (e.g., an easy-to-navigate menu, logical hierarchy), simple and logical navigation, breadcrumb navigation, and an effective search function
- Layout and design: a consistent layout across all pages, a visual hierarchy (important elements stand out), responsive design (works on all devices), and appropriate use of colors, font sizes, and spacing
- Content and language: clear, understandable language, relevant and up-to-date content, good readability (paragraphs, titles, lists), and accessibility (e.g., alt text, contrasts, keyboard accessibility)
- Interactivity and feedback: meaningful feedback (e.g., after clicks or form submissions), helpful and friendly error messages, and clearly recognizable buttons and links
- Loading times and performance: fast loading of pages and content, and technical stability (no crashes or malfunctions)
- Accessibility: support for screen readers, keyboard operability, good color contrasts, and scalable font sizes
- Trust and security: SSL encryption, transparent data protection information, and a professional appearance (e.g., legal notice, contact information)
- Conversion support: clear calls to action (e.g., "Buy Now", "Learn More"), no distractions from the actual purpose of the page, and support for processes like forms or checkout

What are the Goals of Website Usability?

The goals of website usability are to design websites so that users can use them effectively, efficiently, and with satisfaction. Interaction should be as intuitive as possible and lead to the desired outcome.
Jakob Nielsen - one of the leading experts on usability, who has been studying the topic for decades - developed 5 key criteria of usability : 1. Learnability Goal: New users should be able to quickly understand and use the website. Navigation, structure, and functions must be intuitive . Example: A first-time visitor immediately understands where to find information or how to make a purchase. 2. Efficiency Goal: Experienced users should be able to complete their tasks quickly and effectively . Optimized workflows, fast loading times, and clear paths lead to goal achievement. Example: A repeat customer can reorder with just a few clicks. 3. Memorability Goal: Users who haven’t visited the website for a while should be able to easily use it again. This is helped by consistent design, familiar symbols, and a logical structure. Example: A user remembers how to find customer support even after weeks. 4. Errors (Error Prevention and Handling) Goal: The website should be as error-tolerant as possible and help users avoid or correct errors. This can be achieved through clear and understandable error messages, opportunities for correction, or well-thought-out forms. Example: An incomplete form clearly indicates what is missing without deleting all input. 5. Satisfaction Goal: Using the website should be a pleasant experience and generate positive feelings. An appealing design, useful content, and easy navigation lead to higher satisfaction among users. Example: A user-friendly interface with clear texts provides trust and enjoyment while using the site. How Can Usability Be Measured? Measuring usability means systematically evaluating the user-friendliness of a product or website using specific criteria, tasks, and user tests. While usability is partly subjective, it can be objectively measured if you use the right methods. Aspects of the above five listed criteria - learnability, efficiency, memorability, error prevention, and satisfaction - are often tested. Various methods are employed for measurement: Usability tests with users who have to perform specific tasks Questionnaires filled out by users after a test An expert examination based on specific criteria (does not replace user testing) Analysis of analytics and user behavior through web analysis tools, heatmaps, or mouse tracking A/B testing - different versions are tested against specific questions and data Why is Usability Important? There are numerous positive impacts of good website usability and just as many reasons why you should pay attention to good user-friendliness. It makes it easier for users to navigate your website, helps you stand out from the competition, and enhances the impression you leave on potential customers. Let's take a closer look at the individual reasons: Users Want to Be Guided A good website makes it easier for users to navigate. It allows them to quickly get to their desired goal (completing a purchase, downloading a document, finding the needed information) without thinking too much. The offered content should be self-explanatory in terms of the positive user experience. In fact, users don't want to think about how to do something on a website. They want to be guided and don't want to search laboriously for certain elements and "discover" the website on their own. They want to intuitively reach their desired goal as quickly as possible. The content offered needs to be self-explanatory for a positive user experience, and any interactive elements should be directly usable. 
Website Usability as a Competitive Factor
This lack of willingness to stay on a website is partly due to the market situation. In the online world, a competitor is just one click away; unlike on a physical shopping street, there's no need to walk past a series of other stores to reach them. And the reality is that most companies don't have the luxury of offering something so unique (services, physical products, digital products like apps) that users would endure a cumbersome website. Instead, well-prepared and easily accessible website content has the power to convince users to complete conversions on your site rather than a competitor's, making usability a crucial competitive factor. Your website needs to make a good impression from the first second. Many website owners are unaware that users often judge the entire company based on their first impression of the website. A poorly maintained website can negatively impact the entire company's image. Users might question whether the company is capable of offering good service if it can't even manage to keep basic things on the website in order. While this assumption is often incorrect, it illustrates the potential consequences of poor website usability.

Usability is More Than Just Good Design
The challenge in website design is that, unlike with other media, expectations of how a website will be used often differ from how it is actually used. When website owners and designers sketch out a site, they have a certain image in mind. It's often assumed that users will study all the content thoroughly. But in reality, users often only skim through a page initially. They quickly scan the text and click on the first link that seems interesting or appears to be closest to their goal. As a result, a large portion of the page might not be actively noticed by your users. How users read a webpage greatly depends on the goal they have in mind. They focus on words and phrases that match their personal interests, known as trigger words. If they don't find these on your website, it can still be beautifully designed and contain seemingly informative content, but it won't hold their interest.

Technical Performance is Part of Usability too
Besides graphical design, technical preparation is a key aspect of usability. While large images, videos, interactive graphics, and other moving elements may look visually appealing, they significantly increase loading time. The problem: not all users have access to a fast internet connection. A good load time is important for two main reasons: it reduces the chance that visitors lose patience and leave the website, and page load time is becoming an increasingly important ranking factor for search engines.

Usability Engineering
Usability engineering is a structured and systematic process for developing user-friendly systems, in which the usability of a product—or, in this case, a website—is deliberately planned, tested, and improved throughout the development process. This means that usability should ideally be considered from the conception of a website or a relaunch. Good usability doesn't happen by chance; it's systematically planned and tested.

SEO and Usability – A Strong Team
SEO ensures the necessary flow of traffic through organic search results. But you shouldn't focus solely on generating traffic. Without usability, your visitors will quickly leave again, often without converting.
If you only focus on usability, you will reach a significantly smaller group of visitors who may convert well but, due to their small number, may not generate enough revenue to financially sustain the company behind the site. Choosing usability doesn't necessarily mean writing off SEO. If you think SEO is just about ranking first in Google's organic search results and driving traffic to the website, you're not thinking broadly enough. SEO measures aim not only for top placement in search results but also ensure that users can find their way around the site. This goal is shared with usability. Classic OnPage SEO measures include, for example:
Clear page structure
Logical navigation
Breadcrumbs and HTML sitemaps
Pagination
Avoidance of 404 error pages
Logical internal linking
On closer inspection, it becomes clear that these points also significantly improve user navigation. So there are indeed significant intersections between the two online marketing disciplines, and they do not contradict each other. Good SEO specialists and user experience managers are aware of this close connection and ensure that all elements of the website complement each other effectively.

Website Usability and Accessibility
In the context of website usability, the topic of accessibility is also important. Both disciplines pursue the same goal of facilitating the use of websites, but they approach it from different perspectives: usability aims to improve user-friendliness for as many people as possible, while accessibility seeks to make the website fully usable for people with disabilities (especially visual and hearing impairments). Numerous measures improve both web usability and accessibility (for example, a clear structure and understandable language, or good contrasts). Accessibility also requires compliance with technical standards such as the WCAG (Web Content Accessibility Guidelines), including screen reader compatibility and alternative texts for images. Since the Accessibility Enhancement Act came into force, you should assess (or have someone assess) whether your website is not only user-friendly but also accessible. This is important to ensure that people with disabilities can use your website without restriction.

Usability issues usually only become apparent when expected conversions don't occur. Retrospective improvements can be expensive or, in extreme cases, unworkable. That's why you should focus on a good user experience right from the launch or relaunch of your website. However, this doesn't mean neglecting SEO in favor of usability: as shown above, the two disciplines complement each other.
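Good color contrast, one of the accessibility criteria discussed above, can be checked directly against the WCAG 2.x contrast formula. The following Python sketch uses illustrative hex colors; the 4.5:1 threshold for normal text at level AA comes from the WCAG guidelines.

```python
def _linearize(channel: int) -> float:
    # sRGB channel linearization as defined in WCAG 2.x
    c = channel / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(foreground: str, background: str) -> float:
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Illustrative colors: dark grey text on a light grey background
ratio = contrast_ratio("#333333", "#F5F5F5")
print(f"Contrast ratio {ratio:.2f}:1 (WCAG AA requires at least 4.5:1 for normal text)")
```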
Transparency in Google Ads: How to Properly Utilize Performance Max Channel Reporting
Oct 10, 2025

Josephine
Treuter
Category:
Search Engine Advertising

Google Ads is one of the most efficient ways to increase a company's reach and achieve targeted conversions. However, in times of AI and automation, the way campaigns are managed and evaluated is also changing. With the introduction of Performance Max campaigns, Google has created a new approach: all channels, from Search to YouTube to Shopping, are bundled into a single, fully automated campaign. This promises maximum efficiency, but at the same time makes it more difficult to trace through which channels the conversions are actually generated. For a long time, it was unclear which channel contributed what to the campaign's performance. Those who needed this information had to resort to technical scripts and complex workarounds - an effort that overwhelmed many teams. With the new Channel Performance Reporting, this changes fundamentally, allowing results to be evaluated per channel. In this article, we'll show you how to make the most of the new reporting, which best practices have already proven themselves, and how to make better decisions with more transparency. As an experienced Google Ads agency, we provide you with practical tips directly from everyday life at internetwarriors. The Essentials in Brief Performance Max bundles all Google channels into one campaign. The Channel Reporting now provides the necessary transparency. You can see how Search, Display, YouTube, Discover, Maps, and Gmail perform individually. The reports can be segmented by ad format, status, or targets like CPA or ROAS. The new reporting allows you to identify optimization potentials more quickly and control them more precisely. The status section helps with technical issues and offers recommendations for action. What Exactly Is a Performance Max Campaign? The Performance Max campaign , or PMax for short, is an automated campaign format in Google Ads available since 2021. It allows ads to be played simultaneously across multiple Google channels such as Search, Display, YouTube, Gmail, Discover, and Shopping, all in a single campaign. Unlike traditional campaigns, PMax relies on Google AI for ad delivery and optimization. Based on goals such as conversions or revenue, the system independently decides which ad to show to which user on which channel. For advertisers, this means less manual control and more focus on high-quality assets and strategic goal setting. With the new Channel Performance Reporting, it is now finally visible which channel contributes what to the overall performance, and this is an important step toward more transparency and control. Why Transparency in a PMax is so Important Performance Max campaigns offer many advantages: They bundle all Google channels into a single campaign, use AI for automated delivery, and promise maximum efficiency. However, this very automation brings a central challenge: a lack of transparency. It was long unclear through which channel a conversion actually occurred. This was a problem for anyone wanting to optimize their campaigns based on data. Without channel-specific insights, it is difficult to make informed decisions: Should more budget flow into YouTube or Search? Do video ads work better than text ads? Which audiences perform on which platforms? The answers to these questions are crucial for effective campaign management, and this is where the new Channel Performance Reporting comes in. It provides the necessary transparency to evaluate the performance of individual channels, identify optimization potentials, and strategically manage budgets. 
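To make this kind of channel evaluation tangible: once the report data has been exported, metrics such as CPA and ROAS per channel can be computed in a few lines. The sketch below is only an illustration with invented numbers and assumed column names, not the actual Google Ads export format.

```python
# Hypothetical rows as they might be exported from the channel report;
# the column names are assumptions, not the actual Google Ads export format.
rows = [
    {"channel": "Search",   "cost": 1200.0, "conversions": 60, "conv_value": 5400.0},
    {"channel": "YouTube",  "cost": 800.0,  "conversions": 16, "conv_value": 1200.0},
    {"channel": "Shopping", "cost": 950.0,  "conversions": 38, "conv_value": 4100.0},
]

for row in rows:
    cpa = row["cost"] / row["conversions"] if row["conversions"] else float("inf")
    roas = row["conv_value"] / row["cost"] if row["cost"] else 0.0
    print(f"{row['channel']:<9} CPA: {cpa:7.2f} EUR   ROAS: {roas:4.2f}")
```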
For agencies like internetwarriors, this is an important step to not only deliver results to clients but also develop transparent strategies. How to Find Channel Reporting in Your Google Ads Account The new Channel Performance Reporting for Performance Max is currently still in beta. This means that the feature is being rolled out gradually and may not be immediately available in every Google Ads account. The scope of the displayed data can also vary depending on the account, ranging from basic channel metrics to detailed conversion insights. If your account is already enabled, you can find the reporting directly in the Google Ads interface under: Campaign Overview → Select Performance Max Campaign → Insights → Channel Performance There, you will receive a detailed breakdown of important metrics such as impressions, clicks, conversions, costs, and ROAS. The view can be filtered by time period, device, or conversion goal, providing a valuable basis for data-driven optimizations. What Exactly Does the Channel Reporting Show You? The Channel Performance Reporting provides a structured overview of the performance of individual channels within a Performance Max campaign. It shows how the campaign is distributed across platforms like Search, Display, YouTube, Gmail, Discover, and Shopping, and what each channel's share of the achieved conversions is. This transparency allows an informed evaluation of budget distribution, identifies underperforming channels, and assists in prioritizing future investments. Additionally, the reporting offers extensive segmentation and filtering options. The data can be analyzed by key metrics such as Cost per Acquisition (CPA), Return on Ad Spend (ROAS), or Click-Through Rate (CTR). This provides a comprehensive view of the campaign's performance, both cross-channel and data-driven in a strategically usable way. What Can Be Learned from the Data The Channel Performance Reporting delivers far more than just numbers. It opens up new perspectives for the strategic management of Performance Max campaigns. By breaking down key figures like impressions, clicks, conversions, and costs by channel, it becomes visible which platforms are genuinely contributing to achieving targets and how the deployed budget is distributed. This data enables an informed assessment of the used ad formats, targeting methods, and device distribution. Conclusions can also be drawn regarding the customer journey and potential optimization potentials can be identified, for example, in the design of assets or budget allocation. For agencies like internetwarriors, this transparency is a valuable foundation for not only optimizing campaigns efficiently but also communicating transparently with clients. How to Optimize Your Campaigns with the New Insights The channel-specific data from the Channel Performance Reporting provides a valuable foundation for the strategic optimization of Performance Max campaigns. By analyzing individual channels, it becomes apparent which platforms work particularly efficiently, where wastage is occurring, and which ad formats achieve the best results. Based on this, budgets can be distributed more strategically, assets can be designed more precisely, and target groups can be addressed more diversely. Furthermore, the insights enable a more precise evaluation of the customer journey: Are users addressed via YouTube but convert only via Search? Such patterns can now be comprehended and incorporated into the campaign structure. 
The selection of conversion goals can also be newly assessed based on the data to further align campaign orientation with actual user behavior. Limitations and Pitfalls of Channel Reporting Even though the Channel Performance Reporting represents an important step towards transparency, current limitations and pitfalls should not be neglected. Since the feature is still in the beta phase, availability is not guaranteed across the board, and the scope of displayed data can vary from account to account. In some cases, only aggregated values are displayed, without deeper insights into individual ad formats or audiences. Moreover, it should be noted that Performance Max operates cross-channel, and the individual channels do not stand alone but work collectively. A channel with seemingly weak performance can nevertheless make an important contribution to conversion, for example, through early user engagement in the funnel. Therefore, interpreting the data requires a holistic understanding of the customer journey and shouldn't rely solely on individual metrics. Technical limitations such as incomplete conversion attribution, missing asset data, or limited segmentation options can also complicate analysis. Therefore, a combination of Channel Reporting, conversion tracking, and supplementary tools such as Google Analytics or server-side tracking is recommended for a sound evaluation. Conclusion: More Control, Better Decisions With the new Channel Performance Reporting, a decisive step toward transparency within Performance Max campaigns is taken. The ability to evaluate channel-specific data directly in the Google Ads interface provides a solid basis for strategic decisions and targeted optimizations. Even though the feature is still in the beta phase and not fully available in every account, it is already clear how valuable these insights are for modern campaign management. The combination of automation and data-driven control makes it possible to distribute budgets more efficiently, use assets more targetedly, and better understand the customer journey. For agencies like internetwarriors, this means: more clarity in analysis, better arguments in customer communication, and significantly increased effectiveness in digital marketing. As an experienced Google Ads agency, we help you harness the full potential of your Performance Max campaigns. We assist you not only with setting up and optimizing your campaigns, but also with the targeted use of the new Channel Performance Reporting. This way, you'll gain clear insights into the performance of individual channels, can distribute budgets sensibly, and make data-based decisions. With our expertise in AI-supported campaign management and cross-channel analysis, we ensure that your ads not only perform but are transparent and traceable. Get in touch with us!
VKU Marketing Experts 2025 – AI in Focus
Oct 8, 2025

Axel
Zawierucha
Category:
Inside Internet Warriors

On September 24, 2025, Berlin was the hotspot for marketing experts from public utilities. The VKU Marketing Experts Congress provided an excellent platform to discuss the industry's most pressing issues. This year's top topic: the unstoppable rise of artificial intelligence in marketing. As internetwarriors, we were there, represented by our experts Julien Moritz (SEO/GEO expert) and Axel Zawierucha (CEO), to share our knowledge and gain new insights. The transformation is now: AI as a game-changer The atmosphere at the congress was marked by a palpable sense of optimism. Numerous lectures and discussions made it clear that AI is no longer just a buzzword but a tangible tool that is revolutionizing marketing strategies. From personalized customer engagement to automated content creation and data-driven forecasts – the possibilities seem endless. However, with new opportunities come new challenges. One of the central questions that arose in many conversations was: How can companies remain visible in a digital landscape dominated by AI systems and language models (LLMs) and effectively reach their target audiences? Our workshop: Visibility in the age of AI We dedicated our interactive workshop to precisely this question. Under the title "Visibility in the AI Era: How to Position Your Business in New Systems," Julien Moritz and Axel Zawierucha provided practical insights and strategic advice. The interest was overwhelming. Intense discussions with participants made it clear that many companies are seeking guidance on how to prepare their content and data to be optimally captured and presented by AI-based search and recommendation systems. We demonstrated how a well-thought-out data strategy and content optimization for semantic searches can make a significant difference. The many exciting questions and the enthusiastic participation showed us that we struck a chord here. How we as GEO specialists can support Especially in a local context, geographic visibility is crucial. As GEO specialists, we help you strengthen your presence in local search systems and map applications – an important factor to be found even in AI-driven environments. With structured location data, local SEO, and targeted integration into semantic search systems, we ensure that your offerings appear where your target audience is searching – today and in the AI-driven future. Contact us!
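One building block of the structured location data mentioned above is LocalBusiness markup from schema.org. The following Python sketch generates a minimal JSON-LD block; all business details are placeholders, not real data.

```python
import json

# All business details below are placeholders, not real data.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Stadtwerke",
    "url": "https://www.example.com",
    "telephone": "+49 30 0000000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "Musterstrasse 1",
        "postalCode": "10115",
        "addressLocality": "Berlin",
        "addressCountry": "DE",
    },
    "openingHours": "Mo-Fr 09:00-17:00",
}

# Embed the output on the location page in a <script type="application/ld+json"> tag
print(json.dumps(local_business, indent=2, ensure_ascii=False))
```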
DMEXCO 2025: CRM, AI and the Future of Search Engine Marketing
Sep 24, 2025

Axel
Zawierucha
Category:
Inside Internet Warriors

DMEXCO 2025 in Cologne was more than just a trade show for us at internetwarriors.de – it was a vibrant marketplace of ideas, a melting pot of innovations, and above all, a confirmation of the topics that move us and our customers every day. With a record attendance of over 40,000 participants and under the motto "Be Bold. Move Forward.", this year's leading trade fair for digital marketing sent a clear signal: The future belongs to the bold, the pioneers, and those who are ready to blaze new trails. In countless inspiring conversations with customers, partners, and industry colleagues, a common thread emerged for us, connecting the central challenges and opportunities of our time: the inseparable linkage of Customer Relationship Management (CRM), the revolution through Artificial Intelligence (AI), and the redefinition of campaign planning in the era of generative models. The foundation of successful performance campaigns: The CRM feedback loop A theme that repeatedly came to the forefront in our conversations at DMEXCO 2025 was the immense importance of deep integration of CRM systems in performance marketing campaigns. It's a realization as simple as it is crucial: those who want to successfully generate leads must not merely scratch the surface. The mere generation of contact information is only half the battle. The true value unfolds only when a seamless feedback loop between marketing and sales is established. This is where CRM comes into play. It is the centerpiece that consolidates all relevant information about a potential customer and tells us what actually became of a generated lead. Was the contact qualified? Did it lead to a sales conversation? Was a contract concluded? This feedback is pure gold for optimizing performance campaigns. Without this feedback, we operate blindly. We see which ads and keywords generate clicks and conversions, but we don't know which truly lead to revenue. From our extensive practical experience and the intensive discussions at the fair, we can make a clear recommendation. As an official implementation partner of Teamleader in Germany, we have gained deep insights into the capabilities of modern CRM systems. We are convinced that Teamleader unites all critical features to conduct business successfully. From the central contact database, deal tracking, and project management to time tracking and invoicing, the platform offers an all-in-one solution specifically tailored to the needs of agencies and service-oriented SMEs. Seamless integration enables precisely the valuable feedback loop essential for data-driven marketing. The discussions at DMEXCO showed that companies that have successfully closed this loop deploy their marketing budgets much more efficiently. They can target their campaigns on the channels and audiences that deliver the most valuable customers. In an era where digital advertising costs are steadily rising, and competition is becoming more intense, this data-driven precision is no longer a "nice-to-have," but an absolute must for sustainable success. Google in transition: The future of search in the age of AI Of course, the future of AI and Google, along with organic search, was one of the dominant topics in the Cologne exhibition halls. The era of purely keyword-based searches is coming to an end. Generative AI models and the increasing integration of AI into search engine result pages (SERPs) signal a paradigm shift. The question everyone is asking is: How will search change, and what does it mean for our SEO and SEA strategies? 
The keynotes and expert lectures at DMEXCO painted a clear picture: search will become more contextual, dialogue-oriented, and personalized. Users no longer expect just a list of links, but direct answers and solutions to their concerns. Google's "Search Generative Experience" (SGE) is just the beginning. The ability to understand complex queries and answer in full sentences will fundamentally change how we search for information. For us as an agency, this means we need to adapt our content strategies. It's no longer just about optimizing individual keywords but about creating comprehensive thematic worlds that holistically answer users' questions. "Topical Authority" will become the new currency in SEO. We must become the experts in our niche and create content that offers real value both for users and for AI-driven algorithms. At the same time, AI also opens up new possibilities for paid search. Performance Max campaigns are a good example of how Google uses AI to automate and optimize the display of ads across the entire Google network. The challenge for us marketers is to provide the AI with the right signals.
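To illustrate the CRM feedback loop described earlier in this post: at its core, it joins ad-click identifiers with the outcomes recorded in the CRM. The minimal Python sketch below uses hypothetical records; field names such as gclid and revenue are assumptions for illustration, not a specific CRM export.

```python
# Leads captured on the website, keyed by the Google click ID (gclid); field names are illustrative.
leads = [
    {"gclid": "abc123", "campaign": "Brand Search",   "cost": 12.50},
    {"gclid": "def456", "campaign": "Generic Search", "cost": 38.00},
    {"gclid": "ghi789", "campaign": "Generic Search", "cost": 41.00},
]

# Outcomes maintained in the CRM after sales has worked the leads.
crm_outcomes = {
    "abc123": {"qualified": True,  "revenue": 4500.0},
    "def456": {"qualified": False, "revenue": 0.0},
    "ghi789": {"qualified": True,  "revenue": 2200.0},
}

per_campaign = {}
for lead in leads:
    outcome = crm_outcomes.get(lead["gclid"], {"qualified": False, "revenue": 0.0})
    stats = per_campaign.setdefault(lead["campaign"], {"cost": 0.0, "qualified": 0, "revenue": 0.0})
    stats["cost"] += lead["cost"]
    stats["qualified"] += int(outcome["qualified"])
    stats["revenue"] += outcome["revenue"]

for campaign, stats in per_campaign.items():
    # Cost, qualified leads, and actual revenue per campaign close the feedback loop
    print(campaign, stats)
```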
Marketing in the Age of AI: Welcome to the New Reality
Sep 5, 2025

Axel
Zawierucha
Category:
Artificial Intelligence

A specter is haunting the marketing world – the specter of artificial intelligence. But instead of spreading fear, it brings a wave of transformation that redefines the very foundations of our industry. Gone are the days when marketing relied solely on intuition, manual segmentation, and broad campaigns. Today, in 2025, we are in the midst of a revolution driven by algorithms, machine learning, and Large Language Models (LLMs). For us at internetwarriors, it's clear: AI is not a passing trend but the new operating system for successful marketing. But what does that mean exactly? What has really changed? How do you need to adjust your strategies to not just survive but thrive? And how is perhaps the most important component of all – user behavior – changing? This article is your comprehensive guide to marketing in the age of AI. We dive deep into the changes, show you practical strategies, illuminate new user behaviors with current research insights, and look beyond to see which future trends from the USA and Asia will soon be our reality. The new playing field: What AI has fundamentally changed in marketing Artificial intelligence is more than just another tool in your toolkit. It's the invisible hand that optimizes processes, provides insights, and enables interactions at a speed and precision that were pure science fiction just a few years ago. The core changes can be observed in four key areas: 1. Real-time hyper-personalization: Previously, personalization was addressing a customer by their name in an email. Today, personalization means presenting the user with exactly the content, product, or message that matches their current need – across all channels. AI algorithms analyze massive amounts of data from user behavior, purchase history, demographic information, and even contextual data (such as weather or location) in milliseconds. The result: Dynamic web content, personalized product recommendations in online stores, and individually tailored ads perceived not as intrusions but as relevant services. 2. Predictive analytics and data-driven forecasts: Marketing has long been reactive. We analyzed past campaigns to optimize future ones. Marketing AI reverses this principle. Predictive analytics models can predict with high probability which customers are most likely to churn (Customer Churn), which leads have the highest purchase probability (Predictive Lead Scoring), or which products will sell best next season. This foresight allows you to act proactively, distribute budgets more efficiently, and focus your resources on the most promising segments. 3. Automation of content creation and distribution: Generative AI has revolutionized content creation. Tools like ChatGPT, Jasper, and more advanced, industry-specific models can now create high-quality texts for blogs, social media posts, emails, or product descriptions. But it goes far beyond that: AI systems can also generate images, videos, and even music. For you as a marketer, this means a significant increase in efficiency. Routine tasks that used to take hours are now done in minutes. At the same time, AI enables the creation of content for A/B tests in countless variants and the automatic distribution through the right channels at the right time. 4. Efficiency through intelligent automation: Besides content creation, AI automates countless other marketing processes. 
From programmatic buying of ad space (Programmatic Advertising) to intelligent bidding strategies in Google Ads to automatic segmentation of target audiences – AI handles repetitive, data-intensive tasks. This not only results in massive time and cost savings but also minimizes human errors and continuously optimizes campaign performance based on data. The marketing strategy 2025: How to successfully navigate the AI era A new technological reality requires a new strategic approach. It's not enough to just introduce a few AI marketing tools. Your entire marketing strategy needs to be rethought in the context of AI. 1. From target groups to "Segment-of-One": Your radical personalization strategy Your central strategy should be hyper-personalization. The goal is no longer to reach a target group but to treat each individual customer as their own segment ("Segment-of-One"). Practical implementation: Invest in a robust Customer Data Platform (CDP) that centralizes all customer data in one place. Use AI-driven personalization engines for your website, online store, and email marketing. These systems dynamically adjust content based on each user's click behavior, dwelling time, and purchase history. 2. Conversational marketing: Dialogue as the new funnel Users no longer want to fill out forms or be stuck on hold. They expect immediate answers and a direct dialog. AI-powered chatbots and voice assistants provide the solution. Practical implementation: Implement an intelligent chatbot on your website that not only answers standard questions but also qualifies leads, books appointments, and guides users through the purchasing process. Train the bot with your corporate data to ensure accurate and brand-consistent responses. 3. Content strategy: Quality and AI optimization hand in hand In the age of AI content creation, the sheer volume of content will explode. To stand out, two things are crucial: first, exceptional, human-centered quality, and second, optimization for AI systems. Practical implementation: Use generative AI as a tool for ideation, draft creation, and text optimization for SEO. However, the final editing, strategic direction, and emotional depth must come from human experts. At the same time, structure your content (e.g., through Schema.org markup) for easy understanding and prominent placement in answers from AI search engines like Google’s Search Generative Experience (SGE). 4. SEO and AI: The symbiosis for your visibility SEO and AI are inextricably linked. Google's algorithms, particularly RankBrain and BERT, are deeply rooted in machine learning. The future of search lies in answering complex queries, not just matching keywords. Practical implementation: Focus on thematic authority (Topic Clusters) rather than individual keywords. Create comprehensive content that fully answers user questions. Use AI tools to analyze SERPs, identify content gaps, and optimize your content for semantic search. Global outlook: These AI trends from the USA & Asia are defining the future While we in Europe are beginning to fully harness the potential of AI, the USA and Asia serve as "future labs." Different regulations, a higher risk appetite, and a deep-rooted "mobile-first" culture accelerate the adoption of technologies that will soon dominate our market. Trend 1 from Asia: The "Super-App" ecosystem & Social Commerce 2.0 In Asia, especially China, with apps like WeChat or Alibaba , "Super-Apps" dominate daily life. 
In these closed ecosystems, digital life unfolds: chatting, shopping, paying, booking services. AI is the glue that enables a seamless, hyper-personalized customer journey within a single platform.
Live-stream shopping on steroids: Forget QVC. In Asia, live streams are interactive events. AI tools analyze viewer comments in real time to suggest products to the influencer.
Group-buying through AI: AI identifies potential buyers with similar interests and connects them into groups to obtain better prices through joint purchases.
What does this mean for you? WhatsApp and Instagram are increasingly moving in this direction; the trend is clearly heading toward Conversational Commerce. Your first step into this future is the intelligent chatbot described in the strategy section above, which qualifies leads, books appointments, and guides users through the purchase process.
Trend 2 from the USA: The autonomous marketing manager
In the USA, the trend is moving from AI assistance to AI autonomy. So-called "agent-based systems" act as virtual managers that not only assist but autonomously manage campaigns and make strategic decisions.
The Autonomous Marketing Manager: Instead of being told "Create 10 social media posts," the AI decides on its own whether to write blog articles, launch Google Ads, or start an email campaign. It analyzes the market, target audience, and performance to make these decisions.
What does this mean for you? This trend is technologically demanding but will radically change your role as a marketer. Your task will be to centralize your data infrastructure (e.g., with a Customer Data Platform). Only with clean, accessible data can you leverage AI systems to their full potential.
Trend 3 from the USA & Asia: AI influencers and the era of synthetic media
Virtual, AI-generated influencers with millions of followers and contracts with global luxury brands are emerging. They are precursors to a revolution in content creation.
The forebears of change: AI-generated influencers like these herald new possibilities for brands to create their own synthetic personas easily and inexpensively, visually and characteristically tailored to the brand.
What does this mean for you? In a world where AI anticipates needs and tolerance for irrelevance shrinks, marketers should focus on transparency and creativity. Instead of replacing actual people, AI avatars can be used as fantasy figures, futuristic ambassadors, or for scalable, personalized video tutorials.
Transforming the marketer: The new era of AI and marketing
AI is changing the expectations and behavior of consumers. They now anticipate immediate responses and direct dialogue in customer interactions. In response, marketers must evolve their strategies. It is not enough to merely introduce a few AI marketing tools; the entire marketing strategy must be reimagined, with AI treated as a partner rather than a threat.
Develop data competence: You don't have to become a data scientist, but you must learn to interpret data and draw valuable insights from it.
Focus on strategic planning: Instead of manually setting up A/B tests, your task will be to set the strategic direction while the AI optimizes the methods to achieve those goals.
Master creativity and storytelling: In a world where sheer volume of AI-generated content is set to explode, what will make a difference is outstanding, human-centered quality and content optimized for AI systems. Prompt engineering as a new skill: The quality of a generative AI's output now heavily relies on the quality and creativity of the input and command structure you provide. Lifelong learning: In the AI-driven world, continual education becomes a necessity rather than an option, as the technologies and methodologies keep evolving. Conclusion: The future of marketing is a symbiosis with AI The ubiquitous presence of AI is inevitably shaping expectations and consumer behavior. AI is freeing us from repetitive, manual tasks, giving us the capacity to focus on areas where humans are irreplaceable. Success will belong to those who understand this new technology not as a threat but as an ally. The winners will be those who combine the analytical power, speed, and scalability of AI with the irreplaceable human qualities of creativity, strategy, and storytelling. At internetwarriors, we look forward to this future full of opportunities. We see AI as a partner that frees us from repetitive, manual tasks, allowing us to dedicate our capabilities to where humans make the difference.
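To make the predictive lead scoring mentioned earlier in this article more concrete: in its simplest form it is a classification model trained on historical leads. The sketch below uses made-up features and labels and assumes scikit-learn is available; it is an illustration of the idea, not a production model.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical historical leads: [pages viewed, downloads, email opens]; label = became a customer
X = [[2, 0, 1], [8, 2, 5], [1, 0, 0], [6, 1, 4], [9, 3, 6], [3, 0, 1]]
y = [0, 1, 0, 1, 1, 0]

model = LogisticRegression()
model.fit(X, y)

# Score a new lead: estimated probability of becoming a customer
new_lead = [[7, 2, 3]]
print(f"Purchase probability: {model.predict_proba(new_lead)[0][1]:.0%}")
```

In practice, such a score would be fed back into the CRM so that sales can prioritize the most promising leads.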
AI Max for Search Campaigns - How AI is Changing Google Ads
Sep 3, 2025

Markus
Brook
Category:
Search Engine Advertising

Online marketing is constantly evolving, driven by technological innovations. A current example is the introduction of Google's AI Max campaigns. This campaign type is specifically designed for search campaigns and utilizes artificial intelligence to control ads more efficiently. Below, we explain what AI Max for search campaigns is, the benefits it offers, and the requirements it places on advertisers. Key Points AI Max is a new campaign feature in Google Ads that uses machine learning for automated ad placements and bidding. AI Max combines existing Google Ads features such as Broad Match, DSA, and automatically generated assets. The focus is on maximizing conversions and conversion values. AI Max combines traditional search campaigns with AI-driven bidding strategies. Automation reduces management effort but requires clear goals, data, and high-quality assets. Control is achieved through goal definitions and continuous monitoring of campaign performance. Introduction: What is AI Max? Google continually develops its advertising platform, increasingly relying on artificial intelligence. With AI Max for search campaigns , a new campaign feature is introduced specifically designed for Google search. AI Max uses machine learning to automatically control ads, adjust bids in real-time, and increase the likelihood of conversions. The goal is to reduce manual effort and enhance the efficiency of search campaigns. How AI Max Works Unlike traditional search campaigns, Google AI Max heavily relies on automation. Assets, including ad titles, descriptions, sitelinks, or extensions, are provided to the system. The AI combines these components independently and dynamically creates ads that optimally match the respective search query. Additionally, the system continuously analyzes user signals like location, search history, or interaction patterns. This data is used to identify relevant target audiences and optimize ads in real-time. This makes campaign management significantly more precise and faster than manually possible. 1. Keywordless Technology: Search Ads Without Classic Keywords A central element is the so-called “keywordless matching.” Instead of relying on exact or phrase match keywords, Google analyzes landing pages, existing assets, and user behavior with AI to serve appropriate search queries. This is reminiscent of Dynamic Search Ads functionality, but in an even more automated framework. 2. Text Automation with AI The automatically created assets are another building block in AI Max. Google dynamically creates ad texts based on the website, previous ads, and other available data. 3. Final URL Expansion With final URL expansion, Google may direct users to a different target page than originally set if the AI assumes a better conversion probability exists there. This feature is also based on known DSA campaign mechanics. Benefits of AI Max in Google Ads The introduction of AI Max offers several benefits for advertisers: Time Savings Through Automation : Manual adjustments of bids and ad texts are mostly eliminated. Higher Conversion Probability : Google itself states that AI Max can generate up to 14% more conversions on average. Extended Reach : Ads are no longer only triggered by classic keywords but can also cover additional relevant search queries. Transparency : New reporting features show how AI makes decisions and what adjustments were made automatically. Despite the advantages, AI Max also carries risks. Automation can lead to unexpected and sometimes uncontrollable results. 
For example, AI may serve ads for search terms that do not directly align with the brand core or product, leading to irrelevant traffic and reduced efficiency. Another risk is that performance heavily depends on the quality of the provided assets and the data foundation. If these are faulty or insufficient, the AI may draw incorrect conclusions and steer the campaign in the wrong direction. In the worst case, this could result in wasted marketing budgets without achieving the desired results. Especially for clients with limited budgets and insufficient conversions, we currently do not recommend using AI Max. Challenges and Limitations Reduced Manual Control : Many decisions are taken over by the AI, meaning fewer intervention possibilities. Dependence on Data Quality : The AI can only work effectively if high-quality assets and precise conversion goals are provided. Continuous Monitoring Required : Even automated campaigns must be regularly reviewed and adjusted to be successful over the long term. First Practical Insights: What Companies Achieve with AI Max AI Max is not just a theoretical concept but already delivers real results, as proven by two early case studies from the beta phase that Google itself presents. Both L’Oréal Chile and the Australian provider MyConnect used AI Max and were able to make their search campaigns significantly more efficient. L’Oréal Chile: Higher Conversion Rates at Lower Costs The cosmetics giant used AI Max specifically to identify new keyword potentials and increase the relevance of its ads. With success: the conversion rate doubled while the cost-per-conversion decreased by a whopping 31%. An example shows the potential: The campaigns suddenly targeted search queries like “what is the best cream for facial dark spots” – terms that would likely never have been covered with classic keyword strategies. AI Max thus helped to specifically address relevant long-tail intentions without manual setup. MyConnect: More Leads Through New Search Impulses The Australian company MyConnect was already using Broad Match and tROAS. Nonetheless, activating AI Max brought clear improvements: 16% more leads 13% lower costs per conversion 30% more conversions from novel search terms Particularly intriguing: the strong increase in so-called “net-new queries” – search queries previously not covered by the existing keywords or assets. Here lies the real added value of AI Max: it recognizes opportunities that were not visible before. Best Practices for Using AI Max For AI Max to be successfully used, companies should follow some principles: Provide High-Quality Assets – Diverse ad titles and descriptions make it easier for AI to optimize. Define Conversion Goals Clearly – The more precise the goals, the better the AI can control the campaign. Conduct Regular Analysis – Despite automation, controlling metrics like ROAS, CTR, and conversion rate remains important. Review Brand Keywords – It may be wise to exclude brand terms to reach new target groups instead of just serving existing search queries. Conclusion: Opportunities and Limits of AI Max AI Max for Search Campaigns is a step towards greater automation in Google Ads. Companies can benefit from this technology if they strategically prepare their campaigns, set clear goals, and regularly monitor the results. The AI does not replace a well-founded marketing strategy but complements it. When used correctly, AI Max can help use budgets more efficiently, reduce administrative effort, and enhance performance. 
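The brand-keyword review recommended above can be supported with a quick look at the search terms report. The sketch below flags brand queries and estimates the conversion share of non-brand ("net-new" candidate) queries; the terms and figures are purely illustrative.

```python
BRAND_TERMS = {"internetwarriors", "internet warriors"}  # illustrative brand terms

# Invented search terms with their conversions, as they might appear in a search terms report
search_terms = [
    {"query": "internetwarriors agentur", "conversions": 12},
    {"query": "what is the best cream for facial dark spots", "conversions": 4},
    {"query": "google ads agency berlin", "conversions": 7},
]

def is_brand(query: str) -> bool:
    return any(term in query.lower() for term in BRAND_TERMS)

brand_conv = sum(t["conversions"] for t in search_terms if is_brand(t["query"]))
non_brand_conv = sum(t["conversions"] for t in search_terms if not is_brand(t["query"]))
total = brand_conv + non_brand_conv

print(f"Brand conversions: {brand_conv}, non-brand conversions: {non_brand_conv}")
print(f"Share from non-brand (net-new candidate) queries: {non_brand_conv / total:.0%}")
```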
If you want to discover how AI Max or other innovative AI-driven approaches to Google Ads can help your company, we are here for you as experts in SEO, GEO, and SEA. Contact us today for a free consultation to revolutionize your online marketing strategy.
FAQ: Frequently Asked Questions about AI Max
What is the difference between Performance Max and AI Max? Performance Max covers all Google channels, while AI Max is developed specifically for search ads.
Is AI Max suitable for every company? AI Max is best suited for companies with clear conversion goals and a budget large enough to provide the AI with sufficient data for learning. For smaller budgets or very specific niche markets, a classic Google Ads campaign or a targeted SEO strategy may be more sensible.
How do I maintain control when so much is automated? Control is exercised through assets, conversion goals, and regular analysis reports. These provide transparency and show how the AI is optimizing.
Can I exclude keywords? Yes. Excluding keywords, for example your own brand terms, is an important best practice. It helps ensure that the campaign does not only target users who are already searching for your brand but also reaches new potential customers.
LLM Content Focus: What ChatGPT, Perplexity, and Gemini Prefer
Aug 21, 2025

Nadine
Wolff
Category:
Artificial Intelligence

Standard search engine optimization was yesterday – today it's also about designing content in such a way that it can be found, understood, and integrated into responses by Large Language Models (LLMs) like ChatGPT, Perplexity, and Google Gemini. Being cited as a source in AI-generated results not only benefits brand awareness but often also provides valuable backlinks. However, each LLM has its own focus when it comes to selecting content. In this article, you'll learn how these three models work and how you can tailor your content to their preferences. An Overview of the Three LLMs Before we delve into specific tactics, it's worth taking a quick look at how the models function. Each LLM evaluates content according to its own criteria. ChatGPT scores highly with well-structured explanations, Perplexity prioritizes recency and sources, and Gemini uses strong signals from the Google Index and prefers structured and multimedia content. These differences determine which content you should prioritize. ChatGPT – Creative & Dialogical Content ChatGPT excels at reproducing content in a natural, human-sounding language. It favors text that is easy to read, provides clear explanations, and is organized in a logical structure. Preferred Content: Storytelling, illustrative examples, step-by-step explanations Style: dialogical, accessible, understandable to a wide audience Data Source: Mainly training data, with web access in the Pro version Success Factor: Evergreen content that is mentioned on many trustworthy sites has a better chance of being included in the model Perplexity – Research, Sources, Recency Perplexity is an LLM with integrated real-time web access. The unique aspect: It always shows sources and directly links to them. Preferred Content: Recent studies, statistics, professional articles, precise analyses Style: factual, evidence-based, concise Data Source: Live internet search + structured sources Success Factor: Clear source citations, publication date, author, imprint – and content that directly addresses the question posed Extra Tip: FAQ formats and How-To guides are particularly visible because Perplexity often presents answers in a Q&A structure Google Gemini – Multimodal & SEO-Driven Gemini is closely linked with the Google ecosystem and uses traditional search data to integrate content into AI responses. Additionally, it can combine text, image, video, and audio. Preferred Content: SEO-optimized articles, rich snippets, structured data (Schema.org) Style: informative, well-organized, with visual elements such as infographics or tables Data Source: Google Search Index + multimodal analysis Success Factor: Content that already performs well in organic Google ranking has a significantly better chance of appearing in Gemini Content Priorities in Direct Comparison There are significant differences between the models. ChatGPT prefers reader-friendly explanations, Perplexity demands recency and sources, and Gemini rewards SEO structure and media diversity. Use this matrix as a guide for your editorial plan. 
| Criterion | ChatGPT | Perplexity | Google Gemini |
| --- | --- | --- | --- |
| Type of Content | Explanatory texts, examples, storytelling | Professional articles, data, primary sources | SEO-structured articles, media mix |
| Recency | More evergreen | Very high | High, based on Google Index |
| Sources | Indirect via training data | Direct, visible links | Google signals, rich results, markup |
| Format | Prose, Q&A sections | FAQ, how-to, tables, lists | H2/H3 structure, Schema.org, multimedia |
| Language | Dialogical, accessible | Factual, precise | Informative, search-intention oriented |

Optimization Strategies for Each LLM
Even though best practices overlap, focusing on the specific preferences of the models is worthwhile. This way, you can garner more mentions and links.
Optimize for ChatGPT
Begin each key section with the most important answer, followed by brief justifications and at least one example. Explain technical terms in your own words, add a concise definition, and link to additional internal pages if necessary. Structure is crucial. Use clear H2 and H3, frame frequent user questions as subheadings, and answer them directly in the first paragraph below. Add practical examples, checklists, and small sequences of steps. This increases the chance that passages will be used as complete answers.
Optimize for Perplexity
Build a clean source concept. Name primary sources, use quotes sparingly but precisely, and provide numbers with links and dates. Insert a brief summary with three to five key statements at the beginning of an article. Make the publication date, author, and company information clearly visible. Regularly update content. Maintain an FAQ block with real user questions and concise answers of 40 to 80 words. Include tables with important metrics. This increases the likelihood of being directly linked. Additionally, you can bundle in-depth resources and provide them in a resources section at the end.
Optimize for Gemini
Focus on proper on-page fundamentals. Optimize title and meta description, establish a clear heading hierarchy, and use Schema.org markup. Build internal links with descriptive anchor text to thematically related pages, such as guide articles or service pages. Create media to foster understanding, such as an infographic with process steps or a table with pros and cons. Pay attention to E-E-A-T signals. An author profile with qualifications, references, and contact information builds trust.
Examples of Content Elements that LLMs Favor
Short definition at the beginning, maximum two sentences, directly related to the question.
Explanation section with a real-life example.
Mini checklist with three to five points that makes a task doable.
Table with criteria, such as a comparison of methods, costs, or risks.
FAQ section with three to seven real questions.
These building blocks can be used in blog posts, service pages, and knowledge articles. In online shops, they can also function as supplementary guides on category pages.
Common Mistakes That Prevent Mentions
One of the most common mistakes is an unclear structure where users cannot find a direct answer at the beginning of a section immediately. A lack of sources or the use of outdated data also negatively impacts credibility. If a topic is too broadly covered on just a single page, relevance decreases along with the chance of being mentioned. Furthermore, the absence of publication date and author leads to less trust in the content. Likewise, a lack of internal linking can result in crucial context signals being missed, causing LLMs not to rate the content as particularly relevant.
To avoid these hurdles, you should regularly review existing content, structure it carefully, and update it specifically. Conclusion Optimization for LLMs is not futuristic – it is already crucial to remain visible in the new search world. ChatGPT prefers easily understandable, creative, and well-explained content Perplexity relies on current, evidence-based, and source-supported content Gemini accesses SEO-strong, structured, and multimedia content The requirements of ChatGPT, Perplexity, and Gemini vary – but with the right strategy, you can excel in all three models. We support you in developing content that is found, mentioned, and linked not only by search engines but also by AI systems. Get in touch now. FAQ – Frequently Asked Questions About Content Focus How do I know if my content is mentioned in LLMs? With Perplexity, it's easy – sources are linked. With ChatGPT and Gemini, you can test this through targeted queries or monitor it with tracking tools. Do I need to optimize separately for each LLM? Yes, as the models have different focuses. However, there are overlaps, e.g., with clear structure and high source quality. How often should I update content? For Perplexity and Gemini regularly, as recency is a crucial factor. Evergreen content for ChatGPT should also be maintained.
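The FAQ blocks and Schema.org markup recommended above can be combined: an FAQ section becomes machine-readable via FAQPage markup. The following Python sketch builds such a block from placeholder question-and-answer pairs; adapt it to the questions actually answered on your page.

```python
import json

def faq_jsonld(faq):
    """Build FAQPage markup from (question, answer) pairs for a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faq
        ],
    }, indent=2, ensure_ascii=False)

# Placeholder content: replace with the real questions answered on the page
print(faq_jsonld([
    ("How often should I update content?",
     "For Perplexity and Gemini regularly, as recency is a crucial factor."),
]))
```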
The AIO & GEO Platforms Report 2025
Aug 13, 2025

Axel
Zawierucha
Category:
Artificial Intelligence

The digital marketing world is facing its biggest upheaval since the introduction of mobile-first indexing. Artificial intelligence, particularly in the form of generative answer machines, is redefining the rules of online visibility. In this comprehensive report, we analyze the landscape of AI Tools specifically developed for this new era, and provide you with a strategic compass to not only survive in the world of Generative Engine Optimization (GEO) but to win. Critical Assessment and Classification of AI Tools A critical assessment was conducted when integrating the new tools. Tools like Superlines, Rankscale.ai, Kai, ALLMO.ai, Quno, Finseo, Scrunch, SEOMonitor, Ayzeo, LLM Pulse (Generative Pulse), Deepserp, AI Peekaboo, and Evertune were identified as relevant GEO monitoring, content, or hybrid platforms and were integrated into the corresponding sections of the report. Other mentioned tools were deliberately excluded after careful review, as they do not align with the core focus of AI visibility analysis: Behamics is an e-commerce revenue platform, Advanced Web Ranking is a traditional rank tracker without explicit GEO functions, and 'Am I on AI' tools are AI content detectors (which check if a text was written by AI, not what an AI writes about a brand). This differentiation ensures that the report exclusively focuses on the most relevant and direct solutions for Generative Engine Optimization. The Paradigm Shift in Digital Marketing: Generative Engine Optimization The emergence of Generative Engine Optimization (GEO) represents the most significant paradigm shift in digital marketing since the introduction of mobile-first indexing. This report provides a comprehensive analysis of the GEO tool market, which is predicted to reach a volume of 7.3 billion USD by 2031. It outlines the bifurcation of the market into established SEO providers (SE Ranking, Semrush) and specialized startups (Profound, Otterly.ai), evaluates their capabilities, and provides a strategic framework for implementation. The key insight is that visibility in AI-generated answers is no longer optional; it is a critical, measurable, and optimizable component of modern brand strategy. Understanding the New Search Paradigm – Generative Engine Optimization (GEO) This section provides the strategic context by defining the transition from traditional SEO to optimization for AI-driven answer machines. It familiarizes readers with the new terminology, principles, and technical requirements necessary to compete in this evolving landscape. Defining the Post-SEO Landscape: From Search Engines to Answer Engines The fundamental shift in digital search behavior is transitioning from a list of links (Search Engine Results Pages, SERPs) to synthesized, conversational answers provided by generative AI models. This development fundamentally changes the customer journey and optimization goals. While traditional search engine optimization (SEO) focused on achieving clicks, Generative Engine Optimization (GEO) aims to receive citations in AI answers and influence the portrayal of one's brand within these answers. The current market landscape is characterized by a myriad of overlapping terms. For the clarity of this report, the following working definitions are established: AIO (Artificial Intelligence Optimization): This is the broadest term, often referring to making content machine-readable. AEO (Answer Engine Optimization): A more specific term that focuses on structuring content to answer direct questions. 
This targets featured snippets, "People Also Ask" boxes (PAA), and voice search.
GEO (Generative Engine Optimization): This is the most current and relevant term. It encompasses the holistic practice of optimizing content and brand signals to appear in AI-generated answers on platforms like ChatGPT, Perplexity, and Google AI Overviews. This report will use GEO as the primary overarching term.
This shift is not just theoretical. The data confirms the urgency and importance of the topic. As of March 2025, 13% of all Google searches already triggered an AI Overview – a 72% increase over the previous month. Moreover, Gartner predicts that the volume of traditional search engine usage will decrease by 25% by 2026 and by 50% or more by 2028, as users increasingly switch to AI assistants. The coexistence of multiple competing acronyms for a similar concept is a classic sign of an emerging, rapidly evolving market. This indicates not a marketing failure but rather evidence that the practice of AI optimization is solidifying faster than the industry can agree on a unified name.

Core Principles of GEO: A Strategic Framework for AI Visibility
The formalization of GEO as a concept in academic research provides a rigorous theoretical foundation. One of the key insights is that incorporating citations, quotations, and statistics can increase the visibility of a source in AI answers by more than 40%. The E-E-A-T principles of Google (Experience, Expertise, Authoritativeness, Trustworthiness) are of paramount importance for GEO. AI models are explicitly designed to prioritize credible sources. GEO also requires a shift from isolated keywords to building thematic authority around entities (people, products, concepts). A critical tactic is obtaining unlinked brand mentions (co-citations) in authoritative content.

| Metric | Traditional SEO | Generative Engine Optimization (GEO) |
| --- | --- | --- |
| Primary Objective | Ranking on the SERP | Being cited in the AI answer |
| Core Unit of Optimization | Website | Brand/Entity |
| Key Tactics | Keyword optimization, Backlinking | Semantic Structuring, E-E-A-T signals, Co-citations |
| Primary KPIs | Organic Traffic, Keyword Rankings | Share of Voice, Mention Frequency, Sentiment |
| Content Focus | Long-form Articles | Snippet-ready, Structured Answers |
| Authority Signals | Domain Authority, Backlinks | Expert Citations, Data Quotes, Reviews |

The Technical Foundation: The Critical Role of AI-Friendly Schema and llms.txt
Schema markup is the essential infrastructure that makes content readable for AI systems. It provides explicit context and helps AI differentiate facts from filler. Best practices for AI-visible schema:
Using JSON-LD: The format preferred by Google.
Prioritizing Key Schema Types: Organization, Product, FAQPage, HowTo, and Article are particularly effective.
Mapping Visible, Real Content: Do not add schema for invisible content.
Completeness and Accuracy: Fewer, but complete properties are better than many incomplete ones.
The llms.txt file is emerging as a new standard – similar to robots.txt – to provide clear guidelines to LLMs on using website content. It can be easily created with free online tools or WordPress plugins like AIOSEO. The robots.txt file, on the other hand, should be set up by experienced SEOs, as even small errors could, in the worst case, result in LLMs being completely excluded from access.

Market Analysis and Future Outlook
This section offers a macro perspective on the GEO market, analyzing its size, growth drivers, and future development.
Market Landscape: Sizing the GEO Opportunity and Growth Forecasts The global market for Generative Engine Optimization (GEO) services was valued at 886 million USD in 2024 and is expected to grow to 7.318 billion USD by 2031, at a compound annual growth rate (CAGR) of 34.0%. This growth is driven by the rapid adoption of AI-powered search by users. The discrepancy between the growth rates of the GEO market (34.0% CAGR) and the traditional AI SEO Tools market (12.6% CAGR) signals market disruption. Budgets will likely be reallocated from traditional channels. Those not investing in GEO risk the erosion of their existing search visibility. Investments & Innovation: A Look at the GEO Startup Ecosystem The high growth potential has attracted significant venture capital and led to the emergence of specialized startups like Profound, Otterly.ai, and BrandBeacon. These companies are designed from the ground up for GEO and are driving innovations in areas critical for AI Search Monitoring and AI search tracking , such as real-time brand monitoring in LLMs and sentiment analysis of AI answers. The Future of Digital Discovery: Expert Perspectives Experts agree: The change is irreversible. One of the main challenges is measuring GEO successes. Traditional metrics are losing relevance. New KPIs like AI Search Visibility , Share of Voice, and citation frequency are becoming established. LLMs provide "opinions, not lists". If a brand is not among the first mentions, it is practically invisible. Comparative Analysis of AIO/GEO Visibility Platforms This is the core of the report: a detailed, feature-based comparison of the key AI Tools on the market. Evaluation Framework: Key Metrics and Capabilities To fairly evaluate the tools, we defined a framework with the following criteria: LLM & Platform Coverage: Which AI engines are monitored? Core Visibility Metrics: What is measured? (e.g., Share of Voice, Sentiment) Competitive Analysis: How well are competitors tracked? Data & Analytics Capabilities: How is the data processed? Action Orientation & Workflow: Does the tool assist in execution? User-Friendliness & Target Audience: Who is it designed for? Pricing & Value: What is the cost structure? The Established: How SEO Suites Adapt to the AI Era These players leverage their existing infrastructure to enter the GEO market. SE Ranking AI Visibility Tracker: An all-in-one platform that combines traditional SEO and GEO. Ideal for SEO professionals and agencies looking for an integrated solution. Semrush AIO: An enterprise solution focused on large-scale benchmarking and unmatched data depth. SEOMonitor: Specifically developed for agencies to optimize workflows with AI-powered tools. The Challengers: A Deep Dive into Dedicated GEO Monitoring Startups This category represents the "pure" GEO platforms, which are often more innovative and agile. Profound: A premium solution for businesses with real-time insights and advanced features like the "Conversation Explorer." Otterly.ai: An Austrian startup with a strong focus on brand safety and risk management. Peec AI: A specialized platform for global businesses with multilingual and cross-country support. Rankscale.ai: Offers an intuitive user interface and AI-generated suggestions for content optimization at the URL level. Scrunch: Focuses on optimizing the AI customer journey, including journey mapping and persona-based prompting. ... and many more, detailed in the comparison table. 
The Big Comparison Table of GEO Tools
Tool | Strategic Focus | Covered LLMs | Key Metrics | Pricing Model | Ideal User Profile
SE Ranking | Integrated SEO + GEO | Google AIO, ChatGPT, Perplexity, Gemini | Mentions, Links, SoV | Subscription (part of SEO plans) | SEO Professionals, Agencies, SMEs
Semrush AIO | Enterprise Monitoring | Google AIO, ChatGPT, Claude, Perplexity, Gemini | Mentions, Sentiment | Subscription (Enterprise focus) | Large Enterprises, E-commerce Brands
SEOMonitor | Agency Workflow Automation | Google AIO, ChatGPT, Gemini | AIO Visibility, GEO Tracking | Subscription (from €99/month) | SEO and Digital Marketing Agencies
Profound | Enterprise GEO Intelligence | ChatGPT, Perplexity, Gemini, Copilot, Claude | Mentions, Citations, SoV, Sentiment | Premium Subscription ($499+) | Enterprise Brands, Data-Driven Agencies
Otterly.ai | SME Brand Safety | ChatGPT, Perplexity, Google AIO | Rankings, Citations, Brand Safety Warnings | Tiered Subscription ($29+) | PR Teams, Brands in Sensitive Industries
Peec AI | Global GEO Analysis | ChatGPT, Perplexity, Gemini, Claude, Grok | Position Score, Sentiment | Tiered Subscription (€90+) | International Corporations, Global Agencies
Rankscale.ai | Actionable GEO Analysis | ChatGPT, AIOs, Perplexity, etc. | Rankings, Citations, Sentiment | Affordable Subscription (from €20/month) | SEOs seeking quick insights
Scrunch | AI Customer Journey Optimization | Leading LLMs (incl. Grok, Claude) | Sentiment, Competitive Position | Unknown | Agencies, Enterprise Brands
Deepserp | Technical GEO Audit | ChatGPT, Gemini, etc. | AI Crawl Behavior, Citations | Subscription (from $99/month) | Large Websites, Technical SEO Teams
LLMrefs | Freemium Visibility | Key LLMs | LLMrefs Score, Mentions | Freemium ($0 / $79) | Freelancers, Small Businesses
The Specialists: Niche, Integrated, and Hybrid Platforms This category includes tools that have integrated GEO/AEO functionalities into their core offerings. Wix AI Visibility Overview: The first major CMS with an integrated tool for tracking AI visibility, an extremely convenient solution for millions of Wix users. Content & on-page optimization platforms (Rankability, Surfer SEO, etc.): This group focuses on creating content that is structured and semantically rich enough to be cited by AI. PR-focused platforms (LLM Pulse): These solutions highlight which media and sources influence a brand's representation in LLMs. Strategic Implementation and Recommendations This final section translates the analysis into an actionable strategy. Choosing the Right GEO Platform: A Needs-Based Decision Matrix Selecting the right tool depends on your specific goals.
User Profile | Primary Goal | Top Recommendation(s) | Alternatives
Enterprise Brand Manager | Comprehensive Brand Monitoring | Profound | Semrush AIO, Peec AI
SEO Agency | Scalable Client Management | SE Ranking | SEOMonitor, Semrush
SME/Startup Owner | Cost-Effective Visibility Tracking | Otterly.ai | Rankscale.ai, LLMrefs
Content Marketer/Strategist | Creating AI-Optimized Content | Rankability | Surfer SEO, Finseo
Technical SEO | Monitoring AI Crawling Capabilities | Deepserp | ALLMO.ai
Building a GEO-Centered Content Strategy: From Audit to Execution Step 1: Define Requirements & Test Tools: Set your goals and test a shortlist of tools. Step 2: Conduct Baseline Audit: Use a tool to measure your current AI visibility and identify gaps. Step 3: Integrate Analytics: Connect GEO data with web analytics (e.g., GA4) to measure ROI. Step 4: Implement Technical Foundations: Create AI-friendly schema and an llms.txt file (a minimal sketch follows below).
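To make Step 4 concrete, here is a minimal, hedged sketch of what these technical foundations can look like. The company name, URLs, and FAQ content are placeholder assumptions, and the llms.txt layout follows the still-informal convention of a short Markdown file at the domain root; adapt both to your own site before publishing anything.

```python
# Minimal sketch (placeholder brand, URLs, and FAQ content - not a finished implementation).
# Generates a JSON-LD block using schema.org types and a simple llms.txt stub.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",                      # placeholder
    "url": "https://www.example.com",
    "sameAs": ["https://www.linkedin.com/company/example"],
}

faq_page = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is Generative Engine Optimization?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "GEO is the practice of optimizing content and brand "
                        "signals so that they are cited in AI-generated answers.",
            },
        }
    ],
}

# Only mark up content that is actually visible on the page (see the best practices above).
jsonld_snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps([organization, faq_page], indent=2)
    + "\n</script>"
)
print(jsonld_snippet)

# llms.txt: an emerging, informal convention - a short Markdown file at the domain root
# that points language models to the most citable resources of the site.
llms_txt = (
    "# Example Company\n"
    "> Short, factual description of what the site offers.\n\n"
    "## Key resources\n"
    "- [Guide](https://www.example.com/guide): Core explainer content\n"
    "- [FAQ](https://www.example.com/faq): Frequently asked questions\n"
)
with open("llms.txt", "w", encoding="utf-8") as handle:
    handle.write(llms_txt)
```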
Step 5: Execute Content Strategy: Create structured, authoritative content that directly answers user queries. Step 6: Monitor, Iterate, and Report: Continuously track performance and refine your strategy. Concluding Analysis: Mastering Visibility on the AI Search Front The synthesis of the findings shows: The GEO tool market is dynamic and bifurcated, yet the underlying principles focus on E-E-A-T and structured data. The shift from search to answer engines is irreversible, making investments in this area a strategic necessity. The most successful approach will be a hybrid one: combining the in-depth monitoring features of specialized AI Tools with the optimization features of AEO-focused platforms. The winners in the next era of digital marketing will be those who master the art and science of being the most credible, citable, and machine-readable source of information in their field. Ready for the New Search Reality? Seize the first-mover advantage in Generative Engine Optimization. We support you in making your brand visible in AI answers – with a well-founded GEO strategy, tool setup, and content optimization. Talk to our experts and secure your AI visibility of tomorrow!
Find SEO Keywords and Develop a Keyword Strategy
Jul 30, 2025

Julien
Moritz
Category:
SEO

Keywords have been an important foundation of search engine optimization from the beginning. But their role has changed, just as the way we use them has, not least due to the increasing dominance of artificial intelligence (AI). We are convinced that keywords are still very important, so this guide will teach you how to find keywords, what to look for when choosing keywords, which tools you can use, and how to optimally use keywords in SEO. What are keywords? In search engine optimization, we refer to keywords or key phrases as the terms or phrases that users enter into search engines in order to find answers, information, content, or products. We use these terms on websites in certain elements to increase the probability of achieving good positions in search results. In SEA (e.g., Google Paid as opposed to SEO = Google Organic), we bid on keywords to display ads for those terms in search results. To make the topic more illustrative, we explain all the points in this blog post using a specific example: a fictional bicycle online shop or bicycle store with a website, namely the topic "bicycle" or "buying a bicycle". The graphic shows a small selection of different relevant related keywords for this topic: Types of keywords We differentiate between different types of keywords. This distinction plays a role in the strategic direction of our content and the priority we give to those keywords. By length: Short Head & Long Tail Keywords In general, there are two different types of keywords that are defined by their length: Short Head Keywords are short, very general terms that generally have a very high search volume but also correspondingly high competition. The intention behind these keywords is not clear. For example: "bicycle" is a Short Head keyword, which can imply a search for information, such as (what types of bicycles are there?), as well as a purchase intention or even a search for images. Long Tail Keywords , on the other hand, are longer, specific phrases or questions. Depending on the topic, the search volume and competition behind them are significantly lower, and the intention is generally clearer. For example: "best e-bikes 2025" or "buy cheap used kids’ bicycle" or "how to patch a bicycle tire?" In addition, you can define " Mid Tail Keywords " that lie between these two types of keywords. By intention: Information, Navigation, or Transaction? Another important classification of keywords is the intention behind the search query: Informational keywords indicate a search for information. These can be the beginning of the user journey, the first step on the way to a purchase. For example: "trekking bike vs. city bike" or "what to consider when buying a bicycle?" Navigation Keywords point to the search for a specific brand, website, or product. For example: "Decathlon bicycles" or "Cube E-bikes". Transactional keywords show a purchase intention. Users are looking for an online shop or a local buying opportunity. For example: "buy trekking bike" or "order gravel bikes". In addition to these classifications, there are other characteristics of keywords, such as search volume (how often is the term searched per month?) or the "Keyword Difficulty" calculated by many SEO tools (the difficulty of ranking in top positions for this keyword in search results). Why are keywords important in search engine optimization? Keywords reveal which terms (and topics) are searched frequently. To improve visibility in search results for relevant keywords, we use these terms on the respective pages. 
This way, we show search engines that a page (URL) is relevant for a specific topic and the associated terms. Therefore, we research and analyze keywords as the basis of content optimization - to know how and what to optimize. What is the significance of keywords in the age of AI? Are keywords still important in the age of AI? Let’s take a look back and take a short journey through SEO history. The role and significance of keywords have constantly evolved over the last few decades: While about 20 years ago keywords had to be used in their exact form to rank well, Google's language understanding developed over the following years. Various grammatical forms, singular and plural, were recognized as identical terms. Similarly, the connection between synonyms and related terms was recognized - it became about semantics instead of exact keyword matching. We have not been thinking in pure keywords in search engine optimization for many years. Instead, we optimize on topic clusters, naturally supported by keywords. Keywords are still used as the basis, but the context behind them has become significantly more important. A sign of this is that Google often displays websites optimized for a synonym for search queries - Google recognizes that terms mean the same thing. In addition to the clusters, another term has become important: the entity. An entity is a uniquely identifiable object or concept, such as a person, a place, a process. An indication of what Google identifies as an entity includes, among other things, the Knowledge Graph in search results or suggestions for topics or concepts. This can be seen in the example "pedelec": These connections between topics or terms also play a role in AI SEO . Keywords are far from dead; they still serve an important purpose even in the AI age: Keyword analysis helps you understand the terms your (potential) customers use - and to speak the same language as your target audience. Keywords, especially “long tail keywords” like questions, help you better understand the intentions and problems of your target audience and offer corresponding content. Keywords still underpin entities and topic clusters, only they are not used as strictly as they were years ago. It is more important to cover the topic comprehensively and satisfyingly for customers, to meet the intent, and to present yourself as an authority to users and search engines. Keyword analysis - Finding keywords in 3 steps How do we find keywords for our keyword and content strategy? Depending on resources and available time as well as the topic, a keyword analysis can be a complex task. The more general the topic, the more extensive the analysis and the more subtopics and keyword clusters are found. It is easier to research keywords for a single page and a very specific topic. 1. Brainstorming Ideally, you are well-versed in your topic for which you want to find keywords. The first step is always a rough brainstorming session where you jot down all subtopics, terms, and questions related to your topic. With this, you have the first foundation for your analysis on which you can build. If you want to research keywords for an entire website, think of these terms, for example: Your brand name (depending on its popularity, there may be a relevant search volume behind a brand name) Your industry (e.g. bicycle online shop) Categories (e.g. e-bikes, city bikes, mountain bikes, road bikes, kids' bikes, bikes for seniors, accessories, etc.) Brands (e.g. 
Cube Ebike, Bergamont Bicycle) Products or services (specific product names of bestsellers, e.g. cube kathmandu hybrid pro 750) Frequently asked questions about products or categories (e.g. which bike for commuting) 2. Analyzing keywords on the website If you already have a website or an online shop, you should establish a status quo: for which terms are you already (well) found, and through which terms do you get clicks and impressions? To do this, analyze the terms you find in the performance report in Google Search Console, and for which you get many impressions or clicks and add them to your brainstorming list. If you have access to a professional SEO tool, you will find all keywords for which your domain has a position in the top 100 here. You can filter these keywords by search volume or positions and export this list. This way, you find out where there is already potential on which you can build optimization. Particularly interesting here are the so-called "threshold keywords" - keywords that are "on the verge" of a certain area. Usually, this refers to - depending on the definition - keywords between 11 and 20, which are just before the top 10, or keywords starting from position 4 or 6, which are on the verge of the top 5 or top 3. 3. Keyword research After you probably already have a long list of keywords, it's time to research additional terms. A wide variety of sources and tools are available for this. If you want to conduct particularly comprehensive research, you can use all tools, but you can also focus on just a few if you have identified a sufficient number of keywords through these. Tools for keyword research The bad news first: Good keyword tools that you can work with effectively are always chargeable. Free versions generally limit the number of results or queries per day. You have to enter each keyword individually, instead of analyzing a list of terms, and the filter and export options are mostly restricted or unavailable. Professional SEO tools usually come with a keyword tool where you get important information for your entered terms, such as monthly search volume, as well as related terms that you can cluster and further analyze. Some of the most well-known tools include: Semrush, Ahrefs, Sistrix, Surfer SEO, or the KWfinder from Mangools. Free, limited options are available, for example, through Ubersuggest or Answerthepublic. Both tools offer a paid version as well as free research options. A unique feature of answerthepublic is the focus on long tail keywords. If you run ads on Google Ads, you can use the Google Keyword Planner, which gives you search volume, CPC, and competition for each keyword. Further sources to find keywords Besides the classic keyword tools, there are a whole series of other sources available to find keywords. These include: Google Suggest: Terms suggested by Google when entering terms in the search box Google related terms under search results Google Trends: Development of interest over time, but only for frequently searched terms Questions in Google search results Social Media Hashtags on relevant platforms such as Instagram, TikTok, Pinterest, etc. 
YouTube Autosuggest: Suggested terms when entering a word in the search Topics on platforms with user-generated content such as Reddit A thesaurus such as Woxikon Keyword Gaps A very valuable analysis is the so-called "Keyword Gap Analysis", where you compare your website or online shop with one or more relevant competitors. Here, not only direct competitors are relevant (in our example, other bicycle online shops), but all websites that are present for your topic in the search results (for example, an online magazine about bicycles). Using a professional keyword tool, the domains can be compared, and you receive a list of keywords for which your competitors are already ranking in the search results but your domain is not yet. From this, opportunities arise to expand the website or shop, for example with new subcategories or new blog posts. Create a keyword strategy What do you do now with the many keywords you have found through the various sources? The first step is always to cluster these keywords thematically to get a better overview. These clusters can, for example, be oriented along the structure of your online shop and the type of page: Homepage keywords (most general terms and brand, e.g. bicycle online shop) Categories (e.g. mountain bikes, road bikes, kids' bikes, etc.) Subcategories (e.g. 16-inch bikes, dirt bikes, folding bikes, etc.) Product keywords (e.g. Cube Agree c62, Bergamont Sponsor Tour S, etc.) Guide keywords (e.g. what to consider when buying a bike, what is a trekking bike, adjusting hub gears, etc.) FAQ keywords (e.g. what frame size for a bike, how long does an e-bike battery last, etc.) What makes a good keyword? Out of the large number of keywords you have researched and clustered, you will not be able to use all of them. Therefore, it is important to evaluate and prioritize these keywords. Various criteria are important for this, and prioritization also depends on your strategic goals and your industry: Relevance: All keywords must be relevant to your business (if you don't sell road bikes, the keyword "buy road bike" is irrelevant). Search volume: A higher search volume means a higher probability of generating traffic. However, for niche industries, search volume is not the decisive factor; a more suitable keyword with a lower search volume can be more valuable than a more general one with high search volume. The goal is to find the optimal balance. The intention behind the keyword: The intent behind the search query must always match what users find on the page. Keywords indicating purchase intent (e.g., "buy mountain bike") are less important for a pure online magazine than informational keywords. Conversion probability: For an online shop, keywords related to purchases are particularly valuable, as they are more likely to lead to a conversion than informational keywords. The more specific the term (for example, a product name), the higher the chance that someone will buy. For blog-post keywords, this criterion usually plays a smaller role. The competitive situation: Keywords with high competition (for which many large, long-established domains with built-up trust and authority rank) can be too big a challenge for new domains or smaller websites at first. It can take years to rank well for highly competitive keywords; keywords with less competition can achieve quicker results. Possible existing rankings: It is generally easier to improve keywords for which there is already a ranking than to be found for brand-new keywords. This is particularly true for keywords in threshold positions. In summary, a good keyword has high relevance for your business, has relevant search volume, covers the appropriate intention, has a high conversion probability, has low to medium competition, and may already have existing rankings that can be improved.
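To make the prioritization tangible, here is a small, hedged sketch of how threshold keywords could be pulled out of a keyword export. The file name and column names (keyword, position, search_volume) are assumptions; adjust them to whatever your SEO tool actually exports.

```python
# Minimal sketch: filter "threshold keywords" from a hypothetical SEO-tool export.
# Assumptions: a CSV named keyword_export.csv with the columns
# keyword, position and search_volume - adapt the names to your own export.
import pandas as pd

df = pd.read_csv("keyword_export.csv")

# Keywords just outside the top 10 (positions 11-20) with meaningful demand
threshold = df[df["position"].between(11, 20) & (df["search_volume"] >= 100)]

# Best positions first, then the highest search volume
threshold = threshold.sort_values(["position", "search_volume"], ascending=[True, False])

print(threshold[["keyword", "position", "search_volume"]].head(20))
```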
There is no blueprint for assessing and prioritizing keywords and creating a keyword strategy that can be applied to every industry and company. It requires both experience and an overarching company and marketing strategy to which the keyword strategy is adapted. We can therefore only give you suggestions at this point, but we are happy to support you in creating a professional keyword strategy! Keyword Mapping The most important step in the keyword strategy is creating a so-called keyword mapping. This involves assigning keywords to target pages (either existing pages or newly planned ones). Two rules are important: for each important keyword, exactly one page is defined that should rank for it, and each important page is assigned a keyword set of one or two main keywords plus several secondary keywords. This setup prevents multiple pages from competing for the same keyword (the result can be that neither page ranks if Google is unsure which is the more relevant page – the so-called "cannibalization"). You can also identify which important keywords still lack a suitable target page. Based on this keyword mapping, you can then plan and optimize your content.
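A minimal sketch of what such a mapping can look like in practice follows below; the URLs and keywords are illustrative placeholders, and the check at the end simply verifies that no main keyword has been assigned to more than one page.

```python
# Minimal sketch of a keyword mapping: one target page per main keyword.
# URLs and keywords are illustrative placeholders.
from collections import Counter

keyword_map = {
    "/mountainbikes/": {
        "main": ["buy mountain bike"],
        "secondary": ["mountain bike online shop", "hardtail vs. full suspension"],
    },
    "/blog/buying-a-bicycle/": {
        "main": ["what to consider when buying a bicycle"],
        "secondary": ["bicycle buying guide", "which bicycle suits me"],
    },
}

# Cannibalization check: no main keyword may be assigned to more than one page.
counts = Counter(kw for page in keyword_map.values() for kw in page["main"])
duplicates = [kw for kw, n in counts.items() if n > 1]
print("Cannibalization risk:", duplicates or "none")
```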
Content Strategy Once we have assigned the keywords to the appropriate existing or new pages, we can create an editorial plan that covers all topics for which pages need to be optimized or new texts created. For the bicycle example, this could include these texts and pages: Children's bicycles: the category page for children's bikes is supplemented with a guide, a product comparison, and buying advice. Buy bicycle under 500 dollars: either a category or filter page with the appropriate criteria, or a blog post with a product comparison. Best bike for beginners: a blog post with a product comparison and buying advice. Buy a bike with hub gears: the appropriate category page is supplemented with relevant content (comparison of hub gears and derailleur gears, advantages of hub gears, most popular bikes with hub gears, etc.). What to consider when buying a bicycle: a guide article for the blog with an overview of different types of bicycles, criteria for purchase (usage, size, features, etc.), online purchase vs. local purchase, and more. Buy bicycle in #city: a landing page for your own local shop with an overview of the assortment, address, map, opening hours, etc. Using keywords The use of keywords could fill another long article, so here is an overview of the most important elements on a website or in an online shop where you should use keywords: in the metadata (page title/title tag and meta description), in the headlines, in the body text, in the file names and descriptions of images, and in internal links. Caution: Avoid keyword stuffing! One of the biggest mistakes in using your researched keywords is repeating the same keywords too often. If your text sounds unnatural because you've stuffed it with too many identical or similar terms, no one will want to read it, and search engines will likely evaluate it as "keyword stuffing." Make sure you always write content for users. Use synonyms and related terms, and avoid pure lists of keywords or grammatically incorrect use of terms just to cover a keyword exactly. The focus is on providing relevant, helpful content for your target audience while using relevant keywords naturally! SEO vs. SEA Keywords Effective SEO keyword strategies also benefit paid channels. Many synergies can be created between SEO and SEA (especially Google Ads), for example by using the same keywords for organic optimization and paid ads and thus occupying two positions in search results with your website: one paid and one unpaid. A keyword and content strategy developed for search engine optimization can also be used in SEA. Conversely, SEA lends itself to A/B testing ad texts and CTAs (calls to action) that can later be reused in the metadata (page title and meta description). Optimizing page content and using the most-searched keywords also helps in SEA – for example, to improve the Quality Score and pay lower click prices. To gain the greatest benefit, the insights of both channels should be combined – which is why a company benefits greatly when SEO and SEA are handled by one team. Incidentally, at internetwarriors we have long-standing SEO and SEA expertise in one team! A tailor-made keyword & content strategy for you Finding the right keywords and deriving an effective strategy that matches your business goals can be a challenge. The SEO experts at internetwarriors are happy to assist you. We analyze your website, your industry, and your potential, research the appropriate keywords, and create an individual content strategy for you. We are also happy to help you optimize existing content, create new content, or train you and your team on how to use keywords effectively – even in the age of AI.
Server-Side Tracking - An Overview
Jul 16, 2025

Halid
Osmaev
Category:
Web Analytics

Server Side Tracking is the new standard. A significant advantage is the control it provides over the data flow, especially user data. In this article, we discuss Server Side Tracking using Google Tag Manager as an example and look at its benefits and which user data is sent. But first, the important question: What is Server Side Tracking? In short: Server Side Tracking is a data collection method where the tracking information is processed not in the browser but directly on the server of the website operator and only then forwarded to analysis or marketing tools. The traditional tracking method is Client Side Tracking (CST), where a code snippet is embedded in the page, for example via the Google Tag Manager. This sends event data directly to third-party services like Google Analytics 4, Meta Ads, etc. However, control over the sent user data (IP address, demographic data, etc.) is limited to the adjustments offered by the respective tool. Additionally, a third-party cookie is usually set, resulting in a loss of data volume and quality. Figure 1: Comparison of Client-side and Server-side tagging With Server Side Tracking (SST), all data is first sent to a private server on which, for example, the server-side Google Tag Manager is running. This ensures that no undesired data transfer takes place directly in the user's browser; the transfer to third-party tools happens only in the server-side Google Tag Manager, where it can be adapted to a privacy-compliant standard thanks to clear insight into the data and additional configuration options such as transformers. Server Side Tracking vs. Client Side Tracking The traditional Client Side Tracking (CST) is still widespread but is increasingly reaching its limits. In CST, tracking scripts are executed directly in the user's browser, sending data like page views, clicks, or conversions to third-party tools. However, this approach is very susceptible to modern tracking protection measures such as ad blockers, VPNs, Intelligent Tracking Prevention (ITP) in iOS/Safari, and various data protection regulations. In contrast, Server Side Tracking (SST) uses a different approach: tracking data is no longer sent directly from the browser to external tools but first to your own server. This acts as a proxy or central data hub through which all tracking requests run. The server request is treated similarly to an API request and is thus less vulnerable to blocking. Additionally, all data processing takes place within your own infrastructure, significantly reducing the risk when dealing with data protection authorities. Another difference lies in the use of cookies: while Client Side Tracking relies on third-party cookies – which are increasingly blocked by browsers – Server Side Tracking prefers first-party cookies, which are considered more trustworthy and stable. Why is Server Side Tracking now the standard? While Client Side Tracking is increasingly losing its effectiveness due to growing restrictions, Server Side Tracking offers a future-proof, high-performance, and privacy-friendly alternative – with significantly higher data quality and control for companies. Overview of the benefits of Server Side Tracking: More data control: Unlike with the specifications of external tracking tags, companies with SST retain full control over the collected data. Higher data quality: SST can often bypass ad blockers and tracking protection measures, typically leading to at least 12% more data.
Performance advantages: Instead of addressing many individual tracking tools directly from the browser, only one server is contacted – conserving resources and improving website load time. Data protection compliance: By processing data exclusively within their own server structure, companies can better respond to legal requirements. Server Side Tracking and Data Protection Regulations Server Side Tracking offers not only technical advantages but also a significantly better basis with regard to data protection laws. The main legal regulations in the European area are the GDPR, the TTDSG, and the EU-USA Data Privacy Framework. An overview: General Data Protection Regulation (GDPR) The General Data Protection Regulation (GDPR) mandates that personal data – which can be traced back to a real person, such as a name, email address, or IP address – may only be collected and processed with the explicit consent of the users (e.g., through a cookie banner). It has been applicable in all EU member states since May 25, 2018, forming the central legal framework for handling personal data in the European area. The GDPR requires companies to state transparently which data is collected, for what purpose, and how long it will be stored. Additionally, users must be able to object to processing or revoke consent at any time. For tracking, this means: no data may be collected or shared with third parties without clear and voluntary consent – even if the technology allows it. Violations of the GDPR can result in hefty fines. Server Side Tracking offers the advantage that data collection, storage, and sharing can be centrally controlled and better documented – facilitating a GDPR-compliant implementation. Telecommunications-Telemedia Data Protection Act (TTDSG) The TTDSG (Telecommunications-Telemedia Data Protection Act) supplements the GDPR specifically for online services and stipulates that user data – especially data stored or read via cookies or similar technologies – may not be collected without prior consent. The law came into force on December 1, 2021, merging central data protection requirements from the GDPR, the German Telemedia Act (TMG), and the Telecommunications Act (TKG). For online tracking, this means: even setting a cookie that is not strictly technically necessary requires active, informed consent from users, for example through a consent banner. Tracking methods that attempt to create user profiles without consent – even through technologies like fingerprinting – are prohibited under the TTDSG. This tightens the requirements for data-driven online marketing measures and underscores the necessity of making tracking privacy-compliant and transparent – something that is much easier to control with Server Side Tracking. EU-USA Data Privacy Framework Particularly relevant for international companies is the new EU-USA Data Privacy Framework, which facilitates transatlantic data transfer and has been in effect since summer 2023. Previously, it was problematic to send personal data to US services because US authorities had extensive access to it by law. The new agreement creates more legal certainty when US services like Google or Meta are used – but only if the services are certified under the new framework. These are just a few of the laws affecting tracking, which is why an understanding of the user data involved is important. Conclusion: Why does Server Side Tracking offer more data protection compliance? Server Side Tracking allows the entire data processing to run initially through your own server infrastructure. This means: tracking occurs not directly on the user's side but only after explicit consent and under full control of the data processing on your own server. This allows the requirements of data protection laws to be implemented more effectively, for example through targeted anonymization, pseudonymization, or restriction of data sharing with third parties. Overall, Server Side Tracking enables a more privacy-compliant handling of user data and allows companies to maintain oversight and control – which is essential under the current regulatory framework. What user data is sent with Server Side Tracking? The good news: only the absolute minimum. What does this mean? Using Google Tag Manager as an example: when an event on the page, like a click, is triggered, an HTTP request is sent to the server-side Google Tag Manager. Naturally, HTTP header information is sent along. This includes, among other things, the time, IP address, page URL, approximate location (derived from the IP address), operating system, browser, screen resolution, and device. Additionally, there are other parameters specifically related to the configuration; detailed information can be found in the documentation at https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers. There are also parameters automatically captured by the Google Tag for campaign optimization, which include utm_source, utm_medium, utm_campaign, utm_content, utm_term, and the click ID. You should also check which user-defined data is sent along through your Google Tag Manager configuration. In the server-side Google Tag Manager, you can configure precisely, through the use of transformers, which specific data should be forwarded in what form and which should be withheld. For a data-secure implementation, the guiding principle should be: "Track only as much data as needed." The challenge is to limit tracking to what is necessary without incurring disadvantages.
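The following is a deliberately simplified, hedged sketch of this "track only as much data as needed" principle. It is not the actual transformer syntax of the server-side Google Tag Manager; it only illustrates, in neutral Python, how an incoming event could be reduced to a whitelist of fields and a coarsened IP address before being forwarded.

```python
# Conceptual sketch only - NOT the Server Side Google Tag Manager transformer API.
# It illustrates reducing an incoming event to whitelisted fields and a coarsened IP.

ALLOWED_FIELDS = {"event_name", "page_url", "utm_source", "utm_medium", "utm_campaign"}

def anonymize_ip(ip: str) -> str:
    """Zero the last IPv4 octet so the address no longer identifies a single user."""
    parts = ip.split(".")
    if len(parts) == 4:
        parts[3] = "0"
    return ".".join(parts)

def minimize_event(event: dict) -> dict:
    """Keep only the whitelisted fields and anonymize the IP before forwarding."""
    slim = {key: value for key, value in event.items() if key in ALLOWED_FIELDS}
    if "ip_address" in event:
        slim["ip_address"] = anonymize_ip(event["ip_address"])
    return slim

incoming_event = {
    "event_name": "purchase",
    "page_url": "https://www.example.com/checkout",
    "ip_address": "203.0.113.42",
    "user_agent": "Mozilla/5.0 ...",  # dropped: not needed downstream
    "utm_source": "newsletter",
}
print(minimize_event(incoming_event))
```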
We set up correct Server Side Tracking for you The Internetwarriors are a team of experts in various fields of online marketing. One of our main focuses is web analytics and Server-Side Tracking (SST). With extensive expertise and a profound understanding of the latest trends and technologies in digital analytics, we offer tailored solutions to optimize our clients' online presence. We are thus a valuable partner for you when you want to set up professional tracking that provides all the data you need for strategic decisions and for monitoring your online marketing activities. Contact us now without obligation!
Why Usability Is So Important for Your Website
Jun 11, 2025

Nadine
Wolff
Category:
SEO

Website usability means that users can reach their goals without frustration, overthinking, or detours—whether their goal is to get information, make a purchase, download something, or get in touch. The easier and smoother your website is to use, the more likely visitors are to stay—and come back. Good usability not only saves time but also builds trust and improves the perception of your company or brand. Especially in the digital market, just a few seconds often determine whether someone stays or leaves. Therefore, it's crucial that your website is logically structured, easy to understand, and technically impeccable—for all users, on every device. In this article, we'll provide you with an overview of the fundamentals of website usability, explain why it's so important, and outline which aspects you should consider. What is Usability? Usability is often rendered as "user-friendliness" and describes how easily users can achieve the desired result without difficulty. Optimal usability means that a product, service, or website fulfills its intended purpose precisely. On a website, this could mean a completed purchase, the provision of relevant information, strengthening your brand, etc., depending on the intention. Usability is a key factor in determining whether and to what extent visitors engage with your website content or move on to another site. It is an important factor in customer satisfaction and website quality. A usability analysis can help identify errors, weaknesses, and opportunities for improvement on the website. What does Usability encompass? Website usability includes all aspects that affect the user-friendliness of a website—how simple, understandable, efficient, and pleasant it is to use. Many factors interact to create a positive user experience. Specifically, they include: Navigation and Structure: Clear page layout (e.g., an easy-to-navigate menu, logical hierarchy), simple, logical navigation, breadcrumb navigation, and an effective search function Layout and Design: A consistent layout across all pages, a visual hierarchy (important elements stand out), responsive design (works on all devices), and appropriate use of colors, font sizes, and spacing Content and Language: Clear, understandable language, relevant, up-to-date content, good readability (paragraphs, titles, lists), and accessibility (e.g., alt text, contrasts, keyboard accessibility) Interactivity and Feedback: Meaningful feedback (e.g., after clicks, forms), helpful and friendly error messages, and buttons and links that are clearly recognizable Loading Times and Performance: Fast loading times for pages and content, and technical stability (no crashes or malfunctions) Accessibility: Support for screen readers, keyboard operability, good color contrasts, and scalable font sizes Trust and Security: SSL encryption, transparent data protection information, and a professional appearance (e.g., legal notice, contact information) Conversion Support: Clear calls to action (e.g., "Buy Now," "Learn More"), no distractions from the actual purpose of the page, and support for processes like forms or checkout What are the Goals of Website Usability? The goals of website usability are aimed at designing websites so they can be used effectively, efficiently, and satisfactorily by users. Interaction should be as intuitive as possible and lead to the desired outcome.
Jakob Nielsen - one of the leading experts on usability, who has been studying the topic for decades - developed five key criteria of usability: 1. Learnability Goal: New users should be able to quickly understand and use the website. Navigation, structure, and functions must be intuitive. Example: A first-time visitor immediately understands where to find information or how to make a purchase. 2. Efficiency Goal: Experienced users should be able to complete their tasks quickly and effectively. Optimized workflows, fast loading times, and clear paths lead to goal achievement. Example: A repeat customer can reorder with just a few clicks. 3. Memorability Goal: Users who haven't visited the website for a while should be able to easily use it again. This is helped by consistent design, familiar symbols, and a logical structure. Example: A user remembers how to find customer support even after weeks. 4. Errors (Error Prevention and Handling) Goal: The website should be as error-tolerant as possible and help users avoid or correct errors. This can be achieved through clear and understandable error messages, opportunities for correction, or well-thought-out forms. Example: An incomplete form clearly indicates what is missing without deleting all input. 5. Satisfaction Goal: Using the website should be a pleasant experience and generate positive feelings. An appealing design, useful content, and easy navigation lead to higher satisfaction among users. Example: A user-friendly interface with clear texts provides trust and enjoyment while using the site. How Can Usability Be Measured? Measuring usability means systematically evaluating the user-friendliness of a product or website using specific criteria, tasks, and user tests. While usability is partly subjective, it can be objectively measured if you use the right methods. Aspects of the five criteria listed above - learnability, efficiency, memorability, error prevention, and satisfaction - are typically tested. Various methods are employed for measurement: Usability tests with users who have to perform specific tasks Questionnaires filled out by users after a test An expert review based on specific criteria (does not replace user testing) Analysis of user behavior through web analysis tools, heatmaps, or mouse tracking A/B testing, in which different versions are tested against specific questions and data Why is Usability Important? There are numerous positive impacts of good website usability and just as many reasons why you should pay attention to good user-friendliness. It makes it easier for users to navigate your website, helps you stand out from the competition, and enhances the impression you leave on potential customers. Let's take a closer look at the individual reasons: Users Want to Be Guided A good website makes it easier for users to navigate. It allows them to quickly reach their desired goal (completing a purchase, downloading a document, finding the needed information) without thinking too much. In fact, users don't want to think about how to do something on a website. They want to be guided and don't want to search laboriously for certain elements or "discover" the website on their own. They want to reach their desired goal intuitively and as quickly as possible. The content offered needs to be self-explanatory for a positive user experience, and any interactive elements should be directly usable.
Website Usability as a Competitive Factor That users are not willing to stay on a website for its own sake is partly due to the market situation. In the online world, a competitor is just one click away; unlike on a physical shopping street, there is no need to walk past a series of stores. And the reality is that most companies don't have the luxury of offering something so unique (services, physical products, digital products like apps) that users would endure a cumbersome website. Instead, well-prepared and easily accessible website content has the power to convince users to complete conversions on your site rather than a competitor's, making usability a crucial competitive factor. Your website needs to make a good impression from the first second. Many website owners are unaware that users often judge the entire company based on their first impression of the website. A poorly maintained website can negatively impact the entire company's image. Users might question whether the company is capable of offering good service if it can't even manage to keep basic things on the website in order. While this assumption is often incorrect, it illustrates the potential consequences of poor website usability. Usability is More Than Just Good Design The challenge in website design is that expectations of how a website will be used often differ from how it is actually used. When website owners and designers sketch out a site, they have a certain image in mind, and it is often assumed that users will study all the content thoroughly. But in reality, users often only skim a page at first. They quickly scan the text and click on the first link that seems interesting or appears closest to what they are looking for. As a result, a large portion of the page might not be actively noticed by your users. How users read a webpage greatly depends on the goal they have in mind. They focus on words and phrases that match their personal interests, known as trigger words. If they don't find these on your website, it can still be beautifully designed and contain seemingly informative content, but it won't hold their interest. Technical Performance is Part of Usability Too Besides graphical design, technical preparation is a key aspect of usability. While large images, videos, interactive graphics, and other moving elements may look visually appealing, they significantly increase loading time. The problem: not all users have access to a fast internet connection. A good load time is important for two main reasons: it reduces the chance that visitors lose patience and leave the website, and page load time is becoming an increasingly important ranking factor for search engines. Usability Engineering Usability engineering is a structured and systematic process for developing user-friendly systems, in which the usability of a product—or, in this case, a website—is deliberately planned, tested, and improved throughout the development process. This means that usability should ideally be considered from the conception of a website or a relaunch onward. Good usability doesn't happen by chance; it's systematically planned and tested. SEO and Usability – A Strong Team SEO ensures the necessary flow of traffic through organic search results. But you shouldn't focus solely on generating traffic: without usability, your visitors will quickly leave again, often without converting.
If you only focus on usability, you will reach a significantly smaller group of visitors who may convert well but, due to their small number, may not generate enough revenue to financially sustain the company behind the site. Choosing usability doesn't mean writing off SEO. If you think SEO is just about ranking first in Google's organic search results and driving traffic to the website, you're not thinking broadly enough. SEO measures aim not only for top placement in search results but also ensure that users can find their way around the site. This goal is shared with usability. Classic on-page SEO measures include, for example: a clear page structure, logical navigation, breadcrumbs and HTML sitemaps, pagination, the avoidance of 404 error pages, and logical internal linking. On closer inspection, it becomes clear that these points simultaneously improve user navigation significantly. So there are indeed significant intersections between the two online marketing disciplines, and they do not contradict each other. Good SEO specialists and user experience managers are aware of this close connection and ensure that all elements of the website complement each other effectively. Website Usability and Accessibility In the context of website usability, the topic of accessibility is also important. Both disciplines pursue the same goal of facilitating the use of websites, but they approach it from different perspectives: Usability aims to improve user-friendliness for as many people as possible. Accessibility seeks to make the website fully usable for people with disabilities (especially visual and hearing impairments). Numerous measures improve both web usability and accessibility (for example, a clear structure, understandable language, or good contrasts). Accessibility also requires compliance with technical standards such as the WCAG (Web Content Accessibility Guidelines), including screen reader compatibility and alternative texts for images. Since the Accessibility Enhancement Act came into force, you should assess (or have someone assess) whether your website is not only user-friendly but also accessible, to ensure that people with disabilities can use your website without restriction. Usability issues usually only become apparent when expected conversions fail to occur, and retrospective improvements can be expensive or, in extreme cases, unworkable. That's why you should focus on a good user experience from the launch or relaunch of your website onward and, as described above, treat SEO and usability as two disciplines working toward the same goal: helping users find your website and find their way around it.