
Blog Posts
Search engine optimization with Bing SEO
Oct 16, 2019

Thorsten Abrahamczik
Category: SEO

As a website operator, you want your site to be easy to find. You probably think of Google first, not of Bing, right? For many website operators, search engine optimization is an abstract topic, and Bing SEO often gets pushed into the background. This is probably because Germany is a Google country: according to Statcounter, Google had a market share of 94.5% in Germany in 2019, while Bing only had 3%.

Fig. 1: Statcounter statistics on search engine usage in Germany in 2019

Still, you should not ignore the Bing search engine. Users who come via Bing often show better user behavior than those from Google: they typically view more pages, stay longer on the domain, and convert at a higher rate. It can therefore be worthwhile for you as a provider to engage this audience. In addition, the results of Apple's Siri and Amazon's Alexa are based on Bing.

Fig. 2: Comparison of user behavior on Bing with Google

What features does Bing offer?

With Bing's preview function, previously unheard of in search engines, users receive useful information about content and internal links even before they enter the website. The displayed content is actually drawn from the page content and not, as with a snippet, from a meta description.

Commonalities between Bing and Google

In principle, you are already optimizing your website for Bing if you carry out search engine optimization according to Google's guidelines. This is partly because Bing uses similar criteria, and partly because some SEO standards were created jointly by Bing, Google, and other search engines. An example of this is the structured data from schema.org.
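Structured data of this kind is typically embedded as JSON-LD. As a rough illustration (the company name and URLs below are invented placeholders, not taken from this article), such a snippet can be produced with a few lines of Python:

```python
import json

# A minimal schema.org "Organization" snippet serialized as JSON-LD.
# All values here are hypothetical examples.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example GmbH",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
}

# This string would be embedded in a <script type="application/ld+json"> tag.
json_ld = json.dumps(organization, indent=2)
print(json_ld)
```

Both Google and Bing can read this vocabulary, which is precisely why markup based on schema.org pays off on more than one search engine.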
Further aids for Bing search engine optimization

A look at Bing's search results page is also very helpful, since Bing specifies much more precisely where its data comes from. In addition to pure search engine optimization, revising other profiles on the internet, such as Xing or Wikipedia, can also lead to better visibility on Bing. Optimizing the multimedia elements on your website can likewise show stronger effects than on Google, which is visible in Bing's image search and video results. By strategically using structured data and consistently setting alt attributes, you can position images and videos very well in the search engine.

Furthermore, Bing search offers some options that are not available on Google. For example, individual elements in images can be selected for search. Videos are supplemented in the results with extra information, such as whether a clip is a trailer; videos can already be watched in a preview, and Bing considers the search query history to display the most suitable results.

Fig. 3: Identification of elements in Bing image search

What to pay attention to

Microsoft offers various guidelines and tools for its search engine to help users achieve the best possible search engine optimization. The following three are the most relevant from our perspective:

Bing Webmaster Tools

- Submit sitemap.xml
- Block URLs
- Monitor reports
- Control crawls
- Test mobile-friendliness

We warmly recommend using the Bing Webmaster Tools to gain better control. Of course, a domain can also rank in the Bing search engine even if it has not been registered there. Nonetheless, it is interesting to see how Bing assesses the website compared to Google.

Bing Webmaster Guidelines

Bing offers the very informative Bing Webmaster Guidelines on its websites. These provide users with initial help on the topic of search engine optimization.
The instructions are thematically subdivided and clearly presented. They are helpful both for experienced users and for newcomers to Bing SEO who want to know what the search engine operator values.

Markup Validator

If you use structured data on your website, you can check that it is implemented correctly with the Bing Markup Validator. Supported formats are:

- HTML Microdata
- Microformats
- RDFa
- schema.org
- OpenGraph from Facebook

Please note, however, that you must register with the Bing Webmaster Tools to use the tool. This also applies to the SEO Analyzer and the keyword research tool.

Peculiarities of the Bing search engine

We want to give you three general tips that will help you with Bing SEO:

1. Exact match URLs: Bing places much more emphasis on the exact name in the domain than Google does. If you own a domain that includes your brand name, it will therefore tend to rank better on Bing.

2. JavaScript: Currently, Bing is not as advanced as Google in processing JavaScript on websites (as of 10/2019). This means that a site which ranks well on Google might be placed lower on Bing. Sites that load content dynamically through JavaScript are particularly affected. If your site uses minimal JavaScript or does not load content dynamically, you can disregard this point.

3. Google adjustments: Google frequently adjusts its SEO guidelines, suddenly weighting certain features more or less. The last widely known adjustment was the end of support for "noindex" in the robots.txt file. It is important to note that such changes apply only to Google and not automatically to Bing, so don't be alarmed by them.

Conclusion

The Bing search engine offers many unique features that differentiate it from its bigger rival, Google. At the same time, its users exhibit consistently better behavior on websites, regardless of the industry.
To rank well here, you only need to make a few specific adjustments, and you can use Bing's excellent tools to closely monitor your progress.

How can we support you?

Do you want to increase your conversions and revenue through organic search? Are you already ranking well on Google and now want to get started with Bing? We support you with Bing SEO and thus achieve higher rankings in the Bing search engine. Contact us here; we look forward to your inquiry!
On-page SEO Strategies for Better Search Engine Optimization
Jun 18, 2019

Thorsten Abrahamczik
Category: SEO

On-page SEO is a commonly used term among online marketing managers and usually appears in conjunction with off-page optimization. Both terms describe the search engine optimization of your own website. The aim of targeted SEO is to improve the ranking of your own website in search engines, generate more traffic, and ultimately produce more conversions or leads. However, marketing managers usually know only a portion of the actions that can be performed. In this article, we explain what on-page SEO is all about and offer initial tips for practical implementation.

On the internet you will come across various spellings, but they all mean the same thing: onpage SEO, onpage optimization, on-page optimization, on page optimization, SEO on-page, SEO optimization, or even on-site optimization.

On-page Analysis – The First Step in On-page SEO

At the beginning of your work, you should perform an on-site analysis of the entire website. First, you analyze the technical SEO to identify possible technical pitfalls that could affect the crawling of the website. In the next step, you examine the contents of the website in order to carry out content optimization. After investigating both areas, you can derive a prioritized action plan that you implement step by step.

To track the development of your website, you can use the Google Search Console. It serves as an interface between website operators and Google, showing which URLs rank well, for which keywords the domain can be found, and where there are problems on the website. It also shows how often individual pages are displayed on Google's search results pages.

Technical SEO – The Foundation of On-page Optimization

Before you can start optimizing your content, your technical SEO must be implemented flawlessly. This ensures that the website can be crawled easily by search engines and that all content is read and processed accurately.
If technical SEO is not implemented correctly, content optimizations may not take effect properly because the search engines do not have access to the content. Technical SEO is therefore the foundation of every on-page optimization.

Crawling – Can the search engine reach your website?

Crawling refers solely to accessing content and has nothing to do with indexing. Search engines like Google certainly crawl pages, but they do not index all of them. This can occur for various reasons, including poor search engine optimization. To improve the relevance of content, only the content that needs to be processed by search engines should be made crawlable. Pages that are not important, such as search results pages or the imprint, should be excluded from crawling. This is referred to as optimizing the crawl budget, which determines how many pages of a website can be crawled and is set individually by search engines for each domain. In this context, an important distinction applies: crawling is not equivalent to indexing!

To control crawling, use the "robots" meta tag and the robots.txt file. Note, however, that the instructions in robots.txt are merely a suggestion to search engines and can be ignored entirely. Furthermore, search engines can also reach excluded pages in other ways, such as through backlinks. The "Disallow" directive is therefore not a reliable method to keep pages out of the index. On the contrary, incorrect use of robots.txt can lead to major problems on search results pages, as Figure 1 illustrates.

Figure 1: Websites indexed by Google for which no descriptions can be displayed because they are excluded from crawling in robots.txt.

In May 2019, Google updated its in-house crawler to the then-current Chromium version 74. This is an important note because its predecessor was outdated and supported only a few modern web technologies.
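As an aside, the crawl rules discussed above can be tested locally. A minimal sketch using Python's standard-library robots.txt parser (the disallowed paths and domain are invented examples):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that excludes search result pages and the
# imprint from crawling, as suggested above. Remember: these rules
# are advisory only and do not prevent indexing.
robots_txt = """\
User-agent: *
Disallow: /search/
Disallow: /imprint/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://www.example.com/search/shoes"))
print(parser.can_fetch("*", "https://www.example.com/products/shoes"))
```

A check like this is useful before deploying a new robots.txt, since a single misplaced "Disallow" can block an entire section of the site.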
The new crawler can now recognize modern SEO optimization and perform better on-page analyses.

URL Structure, Internal Linking, and sitemap.xml Are Important On-page Factors

Next, you need to review the URL structure of your website. Key questions are: Is it readable and clearly understandable for humans? Can users tell where they are on the website? Is it not too long? The URL structure should not exceed 5 hierarchy levels. This ties in with the structure of the website: today's on-page SEO is not about optimizing for search engines, but for users. Content must therefore be easily and quickly accessible. Excessive cascading of individual webpages is not advisable, as it only creates more hierarchy levels.

John Mueller, Webmaster Trends Analyst at Google, announced on 03/05/2019 in a Webmaster Hangout that the internal linking of pages should be weighted even more heavily than the URL structure. Visible URLs are primarily relevant for the user experience. Internal linking should instead be organized by topic to achieve good on-page optimization; this is also referred to as siloing. For good internal linking, you must ensure that links are made only within a topic (silo). For example, all articles on web analytics should link to each other, but not to articles on SEO. You should therefore always develop a linking concept and check, especially for links with JavaScript functionality, whether crawlers can find and follow them.

If you have similar content, you should use the canonical tag. Imagine you run an online shop and offer a t-shirt in five colors, with a separate page and unique URL for each color variant. In this case, you would have duplicate content on the domain, because all pages would be textually identical and differ only in the color specification.
With the canonical tag, you can indicate on all five pages which URL is actually relevant to search engines and which pages serve solely as added value for the user.

Additionally, you should provide search engines with a sitemap.xml file listing all the URLs that are to be indexed. This file should therefore not include URLs set to "noindex" or URLs excluded from crawling or indexing in any other way.

Avoid Duplicate Content with On-page SEO

Duplicate content is one of the main focuses of on-page optimization. Often, the same pages are accessible under multiple URLs. Classic examples are:

- URL with http and with https
- URL with www and without www
- URL with and without trailing slash
- URLs with very similar content

For this reason, it is essential to set up redirects and to communicate very precisely to search engines, via canonical tags and the sitemap.xml, which pages should actually be indexed. During a redirect, users calling a URL with http are automatically forwarded to the https variant. This happens so quickly that they often do not even notice it.

Pagespeed – How Fast Does Your Website Load?

For several years now, Google has recorded more accesses from mobile devices than from desktop devices. It is therefore important to offer a fast-loading website. This can be achieved in several ways:

- Ensure small file sizes and short source code. This optimizes the website as a whole and ensures a good user experience.
- Optimize visible content through prioritized delivery of the source code. With this measure, users see the first content in the visible area long before the entire website has loaded, which improves the perceived load time.
- Switch the Hypertext Transfer Protocol from http to version 2 (http/2). It is optimized for mobile devices and allows parallel loading of files as well as preloading of content. We also recommend using https for good on-page SEO.

Figure 2 illustrates the Google PageSpeed test, which examines exactly these topics:

Figure 2: With the Google PageSpeed test, you can see which files are too large and which files delay the optimized delivery of visible content.

Mobile Optimization Is Becoming Increasingly Important

In addition to the load time, the display of the website on mobile devices must also be ensured. This is often achieved with responsive design, which makes the website adapt automatically to the screen size of mobile devices. If you are planning a relaunch soon, this issue must be addressed in your concept.

Structured Data for Good On-site Marketing

With structured data, individual pieces of content are specifically tagged for search engines. This may include company information, product information, recipes, events, or, more recently, FAQs. There are many templates for using structured data on a website, although only relatively few are supported by Google. The benefit is that Google can better understand the content and display it separately on the search results page.

Further On-page Optimizations

The technical SEO area also includes further measures such as Progressive Web Apps, image tagging, or multilingualism. These are the "fine-tuning" aspects of technical SEO, so we do not elaborate on them here.

Content Optimization – The Second Step in On-page SEO

Once the website has been improved with technical on-page measures, you can start optimizing your content.
This is basically divided into two areas.

Meta Information: Optimizing Page Title and Meta Description

In on-page SEO, optimizing the page title and meta description is referred to as optimizing the meta information. The page title is a particularly crucial SEO criterion, as it should contain the keyword and the brand. At the same time, it should not be too long, so that it remains readable. You always see the page title in your browser tab as well as on Google's search results page. The meta description itself is not a ranking factor, but it still has a significant indirect impact: it is displayed on search results pages when relevant, and by directly addressing users with a call to action you can improve the click-through rate on your results and thus achieve a better ranking.

Figure 3: Google search results with the page title in blue and the meta description in black.

Content on the Page Itself – What Can the User Expect?

The actual content must meet user expectations, otherwise it will lead to high bounce rates. Remember that in on-page optimization content should be created for users, not for search engines. For this reason, Google places great emphasis on how readable the content is. The texts must also be well structured and organized with headings.

Increasing user interaction with the website is also desirable. This can be achieved through videos, images, image galleries, comments, or similar features. Longer dwell times are generally associated with an increase in conversions.

If the content is well crafted, Google may use it as a Featured Snippet. This is "position 0" on the search results page, where Google directly answers the user's question. Marketers have no influence over the use of a Featured Snippet and cannot predict when a snippet will be displayed and when not.
Getting your own content into a Featured Snippet is therefore considered the pinnacle of content optimization for Google.

Figure 4: Representation of a Featured Snippet on Google's search results page.

Google Jobs – A Brand New Feature

With Google Jobs, the search engine giant introduced a brand new feature in Germany in June 2019, which will prompt many on-page optimizations. When users search for a job title, they are shown a list of available job offers. After clicking, they are redirected directly to the company's page, where they can apply for the job. To appear here, however, website operators must use structured data and provide very specific content on their website.

What We Can Do for You

If you want to improve your positions in the search engine rankings and thereby increase your number of conversions, we offer comprehensive support in the area of on-page optimization. Our on-page SEO measures are coordinated with other online marketing measures. Feel free to contact us; we look forward to your inquiry.
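The guidance on page titles and meta descriptions earlier in this article can be turned into a simple automated check. A minimal sketch in Python; the 60- and 160-character limits are common rules of thumb (actual truncation depends on pixel width), not official values from Google or Bing:

```python
def check_meta_info(title, description):
    """Flag meta information that may be truncated on search results pages.

    Returns a list of warning strings; an empty list means both
    fields are within the assumed limits.
    """
    warnings = []
    if len(title) > 60:
        warnings.append("page title may be truncated (over 60 characters)")
    if len(description) > 160:
        warnings.append("meta description may be truncated (over 160 characters)")
    return warnings

# Hypothetical example values, not taken from a real page.
print(check_meta_info("On-page SEO Guide | Example GmbH", "Short description."))
```

A check like this is easy to run over an entire URL list exported from a crawl, so overlong titles surface before they are cut off on the results page.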
Opt-In, Initial Insights from Practice
Jun 28, 2018

Thorsten Abrahamczik
Category: Web Analytics

Opt-In – Impact on Online Marketing through the EU Cookie Directive

With the new General Data Protection Regulation (GDPR), many marketers have experienced significant confusion regarding the EU Cookie Directive and the Opt-In and Opt-Out procedures. There is additional uncertainty about the e-Privacy Regulation, which is expected to become mandatory in 2019. In our article "No Google Analytics without Google Analytics Opt-Out Cookie", we already discussed the necessity of a Google Analytics Opt-Out cookie on the privacy page. In this article, we explain the Opt-In procedure, which requires the user's explicit consent for analysis and marketing measures, and illustrate how this procedure affects all online marketing channels and activities.

What is the Opt-In procedure?

The Opt-In procedure is based on the increasingly common cookie notice, which mentions the use of cookies on websites. As Figures 1 and 2 show, the wording was revised on May 25, 2018, and in many cases supplemented with additional information on the use of cookies.

Fig. 1: Old cookie notice as used on https://www.internetwarriors.de before the GDPR.

Fig. 2: Current notice, allowing users to exclude themselves from tracking.

Since this revision, many users have had the opportunity to agree to or decline the use of cookies on individual websites. Once users make a decision, it must be respected across the entire domain, aside from specific exceptions such as session cookies. However, many lawyers and data protection officers interpret the GDPR differently, so users are offered a variety of solutions, ranging from simple cookie banners without selection options to full Opt-In procedures. The impact of the Opt-Out procedure is already known; with the Opt-In procedure, however, only very few companies have experience.
For this reason, we tested the Opt-In procedure in the context of GDPR cookies to gather initial insights that we can consider in future implementations.

Distinction between Opt-In and Double Opt-In

Before we look at the implementation and the impact on traffic, we need to distinguish between Opt-In and Double Opt-In:

- Opt-In: When accessing the website, the user sees an information banner about the use of cookies and, if applicable, their purposes. The user must explicitly consent to the use of cookies before web analysis and marketing measures may be carried out. Without consent, neither type of tool may be used.
- Double Opt-In: This procedure is primarily used in email marketing. Upon subscribing to a newsletter, the user receives a confirmation email and must actively confirm the subscription.

As you can see, the two procedures are independent of each other and have nothing in common.

Changes in traffic due to the implementation of Opt-In

As part of our investigation, we examined the traffic development of 10 websites in Google Analytics before and after implementing the Opt-In. The Google Analytics screenshot in Figure 3 shows the daily number of sessions of one website before and after the implementation.

Fig. 3: Traffic development from May 24, 2018, to June 21, 2018. Opt-In was implemented on June 8, 2018.

Comparing the period after implementation with the period before it, and excluding the day of implementation itself, the following traffic changes arise:

Fig. 4: Comparison of developments in Google Analytics in the periods June 9, 2018 – June 15, 2018, and June 1, 2018 – June 7, 2018.

Other websites with about 5,000 sessions per day even show deviations of 83% – 85%. Only a few websites show a smaller deviation than the ones in these screenshots.
Configuring Opt-In: A Step-by-Step Guide

To help you understand how the entire procedure works, we give you a detailed step-by-step explanation. At the end of the article, we also offer our Google Tag Manager container configuration for download, so that you can import it into your own Google Tag Manager and gain experience with the implementation.

GDPR Cookie Notice on the Website

An essential requirement is a cookie banner on your website that informs users about the use of cookies and gives them the option to activate analysis and advertising cookies or leave them deactivated. For simplicity's sake, we use the popular solution Cookie Consent by Insites for our test; we have also integrated this script on our own website. Via the download menu, you can configure a banner that you then only need to copy into the source code of your website. During this process, you must decide whether you want to use the Opt-In or the Opt-Out procedure; in our scenario, we use the cookie notice for Opt-In. The banner is then displayed immediately, so the implementation is easy to carry out even for less technically skilled users.

Storing the User's Decision in a Cookie

Once you have embedded the banner on your website, it is displayed to all users. Initially, however, nothing else happens; no cookies are blocked yet. Insites itself uses a cookie named "cookieconsent_status" to store the user's decision and to avoid displaying the banner again on the next visit. This decision is valid for one year. The cookie values can be seen in Figure 5:

- "allow" for consent
- "dismiss" for rejection

You can also see the expiration date of the cookie, after which the browser no longer considers it. We can read the "allow" and "dismiss" values with the Google Tag Manager and take them into account when triggering Google Analytics, Google AdWords, affiliate tags, and so on.
The Opt-In decision should not be limited to web analytics with Google Analytics, etracker, Webtrekk, etc. alone; all remarketing and conversion tracking from other providers should also be covered.

Fig. 5: Status of the "cookieconsent_status" cookie for Opt-In

Consideration of Do Not Track

In addition to the GDPR cookie decision by the user, we want to consider a second way of rejecting analysis and marketing cookies: the Do Not Track setting. Here, the browser sends information to the server with each page view indicating that no user profile should be created and personal activities should not be tracked. Figure 6 shows the setting in Firefox's "Privacy & Security" section.

Fig. 6: Activation of the "Do Not Track" information in Firefox's privacy settings

Do Not Track is integrated into all relevant browsers, such as Google Chrome, Mozilla Firefox, and Apple Safari, but it is disabled by default. The user must therefore make a conscious decision and enable Do Not Track manually. If they do, website operators offering the Opt-In procedure should respect this decision.

Interaction of the Individual Configurations in the Google Tag Manager

To configure Opt-In in the Google Tag Manager and take the GDPR-relevant cookies into account, we defined the following rules:

- Has the user explicitly agreed to the use of cookies (Opt-In)?
  - If yes, we check whether the user has activated Do Not Track.
  - If no, we keep all tracking disabled.
- Has the user activated Do Not Track?
  - If yes, we keep all tracking disabled. This rule also overrides the previous one, even if the user has agreed to tracking via the banner.
  - If no, we check whether the user has consented to the use of cookies.

Analysis and marketing cookies are activated only if both conditions are fulfilled:

- The user has consented to the use of cookies.
- The user has deactivated Do Not Track.

If even one value deviates, the cookies remain blocked.
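In the Google Tag Manager itself, these rules are expressed with triggers and variables. Translated into plain code for clarity, the decision logic looks roughly like this (a Python sketch, not the actual GTM configuration; the cookie name and values follow the Insites banner described above, and the DNT header carries "1" when Do Not Track is enabled):

```python
from http.cookies import SimpleCookie
from typing import Optional


def tracking_allowed(cookie_header: str, dnt_header: Optional[str]) -> bool:
    """Decide whether analysis and marketing tags may fire.

    Do Not Track always wins; otherwise the "cookieconsent_status"
    cookie must carry the value "allow".
    """
    if dnt_header == "1":  # user enabled Do Not Track in the browser
        return False
    cookies = SimpleCookie()
    cookies.load(cookie_header)
    status = cookies.get("cookieconsent_status")
    return status is not None and status.value == "allow"


print(tracking_allowed("cookieconsent_status=allow", None))
print(tracking_allowed("cookieconsent_status=allow", "1"))
print(tracking_allowed("cookieconsent_status=dismiss", None))
```

Only the first case returns True; consent combined with an active Do Not Track signal, or an explicit rejection, keeps all tags blocked.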
In this way, the website operator offers users maximum protection against cookie-based tracking. At the beginning of the article, we showed that this Google Tag Manager setup resulted in a significant traffic loss in Google Analytics. And since all remarketing and conversion tracking is blocked as well, website operators can no longer tag their users and can measure success far less effectively.

Measuring Do Not Track Usage on Another Website

According to our (non-legally-binding) understanding, there is currently no obligation to use Opt-In tracking, although this may change with the e-Privacy Regulation in 2019. Likewise, we are not aware of Do Not Track being a mandatory measure for website operators. For this reason, we analyzed the use of Do Not Track on an eleventh site. This site offers family entertainment and attracts high national as well as international traffic; it reaches both genders and all age groups from infants to great-grandparents. We therefore consider these numbers a good cross-section of society.

In Figure 7, we compare the total number of sessions with the number of accesses with Do Not Track activated. To measure activated Do Not Track, we use "Unique Events" in Google Analytics, as this value is session-based and thus provides a comparable data basis.

Fig. 7: Do Not Track is activated in 10% of all sessions in the period June 13, 2018 – June 20, 2018

The collection period is June 13, 2018 – June 20, 2018. It is clearly visible that Do Not Track is activated in 10% of all sessions. These users have made a very conscious decision not to be tracked.

Learnings from the Test

These are very valuable and important insights for us. The Opt-In procedure significantly reduces the metrics in analysis and marketing tools and makes it considerably harder to capture users. If the use of Opt-In becomes mandatory, other methods would need to be developed to continue offering online marketing at the same quality.
If you, as a website operator, need to decide between Opt-In and Opt-Out, you now know the pros and cons.

We are also happy to offer our Google Tag Manager container configuration for download. Fill out the following form, and we will send you the download link by email. The .zip file can be opened easily, and you will find a .json file inside. In your Google Tag Manager, click on "Admin" and then on "Import Container". Then select the .json file and import it into your Google Tag Manager container. You will then find all the templates we created. It will be exciting to see how the e-Privacy Regulation affects the EU Cookie Directive and how users take advantage of Opt-In options.

What can we do for you?

If you are unsure whether you need Opt-In tracking, or if you experience difficulties implementing the Opt-In or Opt-Out procedure, we are here to help. We support you in the implementation of your online marketing strategies and can quickly make technical adjustments to your website, should the new e-Privacy Regulation require it.
Early configuration of Google Tag Manager with the Tag Manager Injector
Aug 31, 2017

Thorsten Abrahamczik
Category: Web Analytics

The Tag Manager Injector, a Chrome browser plugin, allows the Google Tag Manager to be set up easily within Google Chrome. We can start work for our clients without long waiting times, while the client's IT takes its time integrating the Tag Manager code into the website's source code.

At the beginning of a collaboration, we work closely with our clients to create tailored tracking concepts. At this point, it is often not yet clear what exactly needs to be tracked and which metrics provide value to the client. Once the tracking concept is finalized and approved by the client, we begin implementing it. First, the Google Tag Manager code needs to be embedded in the site's source code. This can typically take several days, as the client's IT may not be able to implement it immediately. To start configuring in the meantime, we use the Chrome browser plugin Tag Manager Injector. This makes us much more independent of the client's IT and allows us to work faster.

Requirements for Using the Tag Manager Injector

To use the Tag Manager Injector, a Google Tag Manager account must first be created, and a container must be created within this account. The Google Tag Manager then provides the code for the container, which the IT must integrate. At this point, we can already start using the Tag Manager Injector to configure the newly created container. The unique container ID, which identifies this particular container, is important in this context.

Fig. 1: The container created in Google Tag Manager with the container ID

Using the Tag Manager Injector

After this process is completed, you go into the newly created container and create the first tags, triggers, and variables. The container can already be configured exactly as it is intended to be used later; there are no limitations, as third-party tags and scripts can also be used. After setting up the initial tags, Google Tag Manager's preview mode should be activated.
This way, you can check whether the tags have been configured correctly. The next step is to install the plugin. Once the plugin is active, it can be used. To do this, click on the plugin itself and you will see a simple input screen where you enter the container ID of the previously created container. Next, under “Include Domain(s)”, enter the domain on which the Google Tag Manager container should be used. Then click the “Start” button. Once the Tag Manager Injector is active, the area showing the GTM container ID turns green.

Fig. 2: The configured and usable Tag Manager Injector

By activating preview mode in the container, the regular Google Tag Manager preview window opens on the website in the browser. It is now easy to see which tags fire, which variable values are displayed, and what the data layer looks like. If you want to push specific values to the data layer, the Tag Manager Injector offers a clearly visible input field, “Push to the data Layer”. Simply enter the respective information there, and the data will be transmitted.

Fig. 3: The Google Tag Manager preview window in the browser

In the real-time report of Google Analytics, the first hits can now be seen, showing that Google Analytics tracking is working.

How Can We Help You?

Would you like to implement web analytics on your page according to a defined tracking concept but are unsure exactly how to do this? Do you want to measure and increase conversions more effectively but have problems implementing additional tracking? Contact us and we will gladly help you improve your web analytics.
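The data layer that the “Push to the data Layer” field writes to follows simple merge semantics: every push is appended to a message queue, and the container merges each message into its current data model, with later values overwriting earlier ones. A toy model in Python (an illustration of the semantics only, not GTM's actual implementation):

```python
# Toy model of the Google Tag Manager data layer (illustration only):
# pushes are appended to a queue, and the container merges each message
# into its current data model, where later pushes overwrite earlier keys.

data_layer = []   # corresponds to window.dataLayer on the page
data_model = {}   # the container's merged view of all pushes


def push(message: dict) -> None:
    """Append a message to the queue and merge it into the data model."""
    data_layer.append(message)
    data_model.update(message)


push({"pageCategory": "blog"})
push({"event": "login", "userStatus": "loggedIn"})
push({"pageCategory": "article"})  # overwrites the earlier value

print(data_model["pageCategory"])  # the later push wins
print(len(data_layer))             # every push stays in the queue
```

Variables defined in the container read from the merged data model, which is why a value pushed via the Injector immediately becomes visible in the preview window.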
Record entries in domain management and what you need to consider when making adjustments
Jun 22, 2017

Thorsten
Abrahamczik
Category:
SEO

As a website operator, you must also deal with the management of your domain. This involves not just purchasing a domain but also configuring it for web servers and email servers. This primarily concerns larger companies or agencies that manage their domains on dedicated servers, separate from the actual web or email servers. Examples of such providers are NICdirect or 1Blu, where domains can be purchased and managed and which specialize in hosting large enterprises. However, regular web hosts like Mittwald, Webgo, Host Europe, etc., also offer domain configuration to some extent.

Fig. 1: Record entries for a domain of internetwarriors GmbH

The Configuration of the Domain Name System

The Domain Name System, often abbreviated as DNS, is one of the most important services on the internet. So-called DNS servers ensure that URL names are converted into IP addresses. When you type a URL into the browser, servers on the internet initially cannot do anything with it, because they identify each other by IP addresses. An IP address is a unique numerical combination assigned individually to each device on the internet. It is most easily compared to a phone number assigned to every phone line. To find out which server hosts the requested webpage (the URL typed into the browser), the browser first sends a request to a DNS server. This server maintains a large database that stores, for each domain, the IP address of the corresponding server, similar to a phone book in which each name is paired with a phone number. In response to the request, the DNS server sends the IP address of the webpage's server back to the browser. The browser can then place the request for the webpage directly with the actual web server.

Types of Record Entries

For the DNS server to know which IP address is behind a URL, this must be set in the domain configuration. There are various so-called record entries for different services.
The most important ones are:

NS
A
AAAA
MX

Name Server Record - NS

The Name Server Record, often referred to as NS, is responsible for name resolution, i.e., resolving the names of services, e.g., domains, into computer-readable addresses. These addresses are the so-called IP addresses. Each of the entries has a so-called TTL (Time to Live). This determines how long an entry remains valid in the cache before it must be renewed. Typically, this value is 86400 seconds, i.e., 24 hours. Moving a domain to a new web server can correspondingly take up to 24 hours, as DNS servers worldwide must first be updated before the correct IP address is delivered to browsers.

Address Record - A

The Address Record, or A Resource Record, ensures that an IPv4 address is assigned to an entry on a DNS server. IPv4 addresses are IP addresses based on a four-octet (32-bit) system. This severely limits the number of possible addresses, which can no longer cover all devices connected to the internet, e.g., computers, smartphones, servers, etc. Nevertheless, IPv4 is still very common today.

Address Record - AAAA

The AAAA record provides essentially the same functionality as the A record. However, it is based on the IPv6 addressing system, the successor of IPv4. It offers significantly more IP addresses and can therefore cover a much larger number of devices and DNS entries.

Mail Exchange Record - MX

An MX Resource Record describes under which host name the corresponding email server can be reached. Through this, email programs can send and receive their emails. It is important to note that this entry must always contain a fully qualified domain name, not an IP address.

What Can We Do for You?

With our many years of experience in web hosting and domain management, we are happy to assist you in managing your website. Please contact us if you do not want to handle the technical management of your website yourself or are planning a relaunch.
We would be very pleased to discuss your individual support needs with you.
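The record types described above can be pictured as rows in a lookup table keyed by name and type, each with its own TTL. A minimal sketch in Python (the zone data below is made up using documentation addresses; real resolution is performed by DNS servers, e.g. reachable via `socket.getaddrinfo`):

```python
# Toy DNS zone: each record maps (name, type) to a value plus a TTL in
# seconds. The typical TTL of 86400 s (24 h) is why a moved domain can
# take up to a day to resolve to its new address everywhere.

ZONE = {
    ("example.com", "A"):    {"value": "203.0.113.10", "ttl": 86400},
    ("example.com", "AAAA"): {"value": "2001:db8::10", "ttl": 86400},
    ("example.com", "MX"):   {"value": "mail.example.com", "ttl": 86400},
    ("example.com", "NS"):   {"value": "ns1.example.net", "ttl": 86400},
}


def resolve(name: str, record_type: str):
    """Return the record value, or None if no such entry exists."""
    record = ZONE.get((name, record_type))
    return record["value"] if record else None


print(resolve("example.com", "A"))   # IPv4 address from the A record
print(resolve("example.com", "MX"))  # mail server as a hostname, not an IP
print(ZONE[("example.com", "A")]["ttl"] // 3600)  # TTL in hours
```

Note how the MX entry holds a hostname rather than an address: that hostname is in turn resolved via its own A or AAAA record.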
A/B Testing with Google Optimize
May 11, 2017

Thorsten
Abrahamczik
Category:
Web Analytics

With Google Optimize, Google has introduced a new tool for conducting experiments on websites. Originally introduced as part of the Google 360 Suite, Google now offers the program, with a few restrictions, as a free version. This makes it easy for all marketers to use the tool for their own experiments.

Fig. 1: The homepage of Google Optimize

In addition to A/B tests, the program also supports multivariate and redirect tests. The test types differ as follows:

A/B Testing: This involves testing individual variations of the same webpage against each other. Typically, the variations differ only in small details, such as a different button color or a new call to action.
Multivariate Tests: These work similarly to A/B tests, but here several elements of a page are tested to find the best possible combination of elements. At the same time, the interaction between the individual variations can be investigated more closely for conversion optimization. This quickly leads to a significantly larger number of variations.
Redirect Tests: In these tests, separate pages with their own URLs are tested against each other. This way, different versions of entire pages can be tested effectively.

Creating an Account and Container in Google Optimize

To get started, users must open the homepage of Google Optimize. On first opening, email subscriptions (Tips & Recommendations, Product Announcements, and Market Research) can be selected or declined. In the next step, the user must configure their account settings once.

Fig. 2: One-time configuration of the Google Optimize account

After making the changes visible in Figure 2, you can immediately begin setting up a website test. An account and a container are created for the user right away, in which tests can be managed and configured. The setup is identical to the Google Tag Manager, which also relies on an account with individual containers.
This significantly facilitates operation.

Start with a First Test

Before marketers begin creating a test, they must ensure that the website to be tested receives enough visitors. Only then can valid data be collected. If a website receives only a few visits per month, evaluating a test takes much longer to yield statistically valid results. In this case, only a few variations should be tested. In addition, marketers should initially conduct only small tests, such as changing a button color or swapping out a text. This allows them to learn how to use the tool and understand how to build meaningful and effective tests. More complex tests can be created later. To start a test, the user must create a new experiment. Google offers templates for this, in which the user must select a name for the experiment, the URL of the page to be examined, and the type of test. Figure 3 shows the corresponding screen from Google Optimize.

Fig. 3: These details can be used to create an experiment in Google Optimize.

Work with Variations

Once an experiment is created, the user can create so-called variations. These are slight alterations of the website. Regardless of their number, each variation is always tested against the original version of the website. At this point, the marketer can also specify how much traffic should participate in the test and how much traffic each variation should receive. By default, 100% of the traffic participates in an A/B test, and this traffic is evenly distributed across all variations. So, if there is the original version of the website and one variant, each receives 50% of the traffic.

Fig. 4: For each test, a goal and a hypothesis must be set.

After the variations are created, goals and descriptions must be set for the test. Examples of test goals include:

Reducing bounce rates
Increasing the number of page views
Increasing the number of transactions

Subsequently, the individual variations must be configured.
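The even traffic distribution across variations can be pictured as deterministic bucketing: a stable visitor ID is hashed, and the hash decides both test participation and variant assignment, so a returning visitor keeps seeing the same variant. A simplified Python sketch (not Google Optimize's actual algorithm):

```python
import hashlib


def assign_variant(visitor_id: str, variants: list,
                   traffic_share: float = 1.0):
    """Deterministically assign a visitor to a variant.

    Returns None if the visitor falls outside the share of traffic
    that participates in the experiment.
    """
    # Hash the visitor ID to a stable number in [0, 1).
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000
    if bucket >= traffic_share:
        return None  # visitor is not part of the experiment
    # Map the bucket evenly onto the variants (original included).
    index = int(bucket / traffic_share * len(variants))
    return variants[min(index, len(variants) - 1)]


variants = ["original", "variant_a"]
counts = {"original": 0, "variant_a": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}", variants)] += 1

print(counts)  # roughly a 50/50 split across 10,000 visitors
```

Because the assignment depends only on the hash of the visitor ID, no state needs to be stored to keep the experience consistent across page views.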
A visual editor is used to configure the variations, allowing marketers to make directly visible changes to the website. For small changes, no knowledge of HTML, CSS, or JavaScript is necessary. For more complex changes involving HTML, CSS, or JavaScript, a general technical understanding of HTML and CSS is certainly required; for JavaScript changes, programmers should be consulted.

Fig. 5: Google recommends installing the Google Chrome browser plugin for operating the visual editor of Google Optimize.

To work with the visual editor, a browser plugin must first be installed. Google Chrome checks for the plugin and, as seen in Figure 5, suggests installing it if necessary. Once the plugin is installed, the user can open the website and make adjustments. Figures 6 to 10 show how this works:

Fig. 6: By hovering the mouse, users select a webpage element. The individual elements are directly marked and highlighted by Google Optimize.
Fig. 7: Changes to the selected element can then be made using the visual editor at the bottom right of the screen.
Fig. 8: By clicking on "Edit Element" in the visual editor, further options can be selected, in this case "Edit text".
Fig. 9: Subsequently, the text of the H1 headline can be easily modified.
Fig. 10: The bar at the top of the screen shows, among other things, which element the user is in (H1), how many changes have been made, and how each change appears on different device types such as desktop, tablet, and smartphone.

Once the desired changes are made and saved, the appearance and behavior of the changes must be checked on each device type. Errors can quickly creep in due to custom programming on the site, and these can be avoided through extended checks.

Linking Google Optimize with Google Analytics

In the next step, Google Optimize must be linked with Google Analytics. For this purpose, the user selects a data view within the desired Google Analytics property. The user behavior data of this data view is then used to evaluate the test.
This way, changes in bounce rates, the number of transactions, etc., can be considered in the experiment.

Integration of Google Optimize via the Google Tag Manager

In its developer area for Google Optimize, Google recommends using a modified Google Analytics code. This loads faster and prevents the screen flickering caused by the dynamically applied changes to the website. As a result, the user does not notice on page load that they are being shown a different variant. However, Google Optimize can also be integrated via the Google Tag Manager. In this case, the Google Tag Manager code should be placed as high up in the source code as possible to avoid possible screen flickering. The execution order of the different codes on a page load is as follows:

1. The user loads the page
2. The Google Tag Manager code is executed
3. The Google Optimize code is executed
4. The Google Analytics code is executed

Because the Google Tag Manager is used, there is a delay in execution, as Optimize can only run once the Tag Manager has loaded and executed. This is not the case with the modified Google Analytics code, where the Google Optimize code can be executed immediately. As a result, on pages with many images or resources to load, the mentioned screen flickering can be reduced or avoided entirely. In any case, Google Analytics should only be executed after the Google Optimize code, regardless of whether the Google Tag Manager is used. A corresponding configuration must be set in the advanced settings of all Google Analytics tags on the website. Figure 11 shows the configuration of a Google Optimize tag in the Google Tag Manager. In this tag, essentially only the property ID of Google Analytics (also known as the UA number) and the container ID of the Google Optimize experiment must be entered. To comply with data protection requirements, the IP address should also be anonymized.
The trigger should be configured as precisely as possible so that the tag fires only when the corresponding page is called or the corresponding event is triggered.

Fig. 11: Configuration of a Google Optimize tag in the Google Tag Manager.

Defining the Target Audience and Timeframe

Once the Google Optimize tag is published in the Google Tag Manager or the Google Optimize snippet is embedded in the page's source code, the further configuration of the experiment can proceed. For this, the target audience and duration of the test must be defined. In the free version, only the share of users participating in the test can be selected for the target audience. More granular settings, such as age, gender, source of access, etc., are currently not possible. The timeframe can be set individually. Alternatively, the test can be started immediately and stopped manually at any time. Figure 12 shows the currently possible settings.

Fig. 12: Target audience settings in Google Optimize.

After an experiment, marketers can evaluate the results in the reporting area. Here, the data is displayed both in total and separately for each variation. In addition to standard data such as page views, the winner of the test is also displayed, including the measured improvement over the baseline. The individual variations are also checked against the set goals. In conjunction with further Google Analytics data, this makes it easy to identify on which days and at which times each variation performed better.

Fig. 13: Evaluation of the results of an experiment. Source: https://support.google.com/360suite/optimize/answer/6218117?hl=en.

What We Can Do for You

Would you like to test individual elements of your website to increase conversions, leads, or transactions? Start with the free version of Google Optimize to quickly and easily conduct individual experiments.
We are happy to advise you on the implementation and evaluation of corresponding tests for conversion optimization, in line with your defined website goals. Contact us.
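When evaluating experiment results like those described above, the core question is whether a variation's conversion rate differs significantly from the original's. Google Optimize computes this internally (using Bayesian statistics); a classical two-proportion z-test sketched in plain Python conveys the underlying idea. The visit and conversion numbers below are made up:

```python
import math


def two_proportion_z_test(conv_a: int, n_a: int,
                          conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value


# 200 conversions out of 10,000 visits (2.0%) for the original,
# 260 out of 10,000 (2.6%) for the variation.
z, p = two_proportion_z_test(200, 10_000, 260, 10_000)
print(round(z, 2), p < 0.05)  # the uplift is statistically significant here
```

The example also illustrates the traffic requirement mentioned above: with only a few hundred visits per variant, the same relative uplift would not reach significance.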
What clients can expect from their agency in web analytics support
Feb 2, 2017

Thorsten
Abrahamczik
Category:
Web Analytics

Web analytics, what exactly is it? For many of our clients, web analytics means using a tool that receives data that is barely interpretable and possibly invalid. Operating it quickly becomes overwhelming, so the tool is used only once every few months, whenever management wants to see data about the website or when it becomes necessary to justify why the planned marketing budget for the next year is so high. But web analytics is much more than that! Web analytics means questioning interactions on your own website, evaluating the behavior of individual target groups, creating customer journeys, and much more. Ultimately, all these evaluations and results should yield an action plan that enables you to optimize your website in a targeted way. In this article, we would like to explain what you, as an online marketer, can expect from an agency that supports you in the area of web analytics. We will discuss our experiences and explain how we approach the topic of web analytics with a client.

Fig. 1: Cross-device web analytics | Source: http://bit.ly/2kxlWF4

Phase 1 – Reviewing the Status Quo in Tracking

If you approach us as a potential new customer to talk about web analytics, we first describe what web analytics means to us: web analytics is the foundation of all marketing activities, both online and offline. It collects both quantitative and qualitative data. Used properly, it describes what happens through customer interactions and, most importantly, why these interactions occur. Furthermore, where benchmarks are available, it places your own data in relation to that of other companies or industries. This shows where you can improve as a company. But more than anything, web analytics is one thing: continuous. Web analytics is very comprehensive and requires intensive support. However, it is not rocket science and can be implemented without difficulty.
To ensure this, we first check your current tracking implementation. This way, we obtain the status quo of your web analytics and can better assess where problems and potential lie.

Reviewing the Configuration of the Web Analytics Tool

First, we go into the tracking tool and check which data is flowing in, how the tool is configured, and whether the data is valid. Most of our clients use Google Analytics, making the verification relatively straightforward and almost standardized. Nevertheless, we also look for peculiarities in the data and configuration. Examples include:

Are filters being used correctly?
Is there spam in the data that distorts evaluations?
Is Google Analytics linked with other services like AdWords, Search Console, etc.?
Is data from the internal site search being collected?
Is demographic data activated?
Have goals been set up in Google Analytics?

As you can see, there is much to consider when configuring Google Analytics. Especially if you feel uncertain as a marketer and cannot precisely assess which setting has which impact, key figures can easily be collected or interpreted incorrectly. A classic example is excluding your own visits through an IP address filter. When we check the filter configurations in Google Analytics, the filter is set up incorrectly and does not work in 95% of cases. Our staff member Bettina Wille has written an extensive article about what you need to watch out for when configuring filters in Google Analytics. We also check whether you are using reports. These are generally set up as dashboards, custom alerts, or custom reports, so this is easy to verify.

Checking the Technical Implementation in the Source Code

Once we know how your Google Analytics is configured, we check the implementation in the source code. This is not only about seeing whether the tracking code is implemented but also about how it is implemented.
Below is a sample selection of aspects we check:

Which version of the Google Analytics tracking have you implemented?
Has the tracking code been fully implemented according to the configured Google Analytics settings?
Are there pages where the tracking code is not implemented?
Are multiple tracking codes present, so that data is collected twice?
Are special setups such as cross-domain tracking, e-commerce tracking, etc., implemented correctly?
Are additional tracking features being used, e.g., custom dimensions, event tracking, etc.?
Are referrers being passed on correctly, and is direct traffic really just direct traffic?

Reviewing Data Protection

Finally, we check in individual areas whether your Google Analytics is installed in compliance with data protection regulations. Please note, however, that we are not a law firm and therefore cannot provide legally binding advice. Nonetheless, there are features that can be checked easily, such as:

Are you anonymizing the user's IP address?
Do you have a privacy page that mentions the use of Google Analytics?
Is there an option for users to opt out of tracking, both via a browser plugin and via a functioning opt-out cookie?
Have you signed a data processing agreement with Google?

Phase 2 – Developing a Tracking Concept

In this article, we have so far focused exclusively on implementations with Google Analytics. Of course, there are also other types of web analytics with additional tools for things like A/B testing, surveys, etc., which also need to be reviewed. However, we do not wish to go into these tools individually in this article. After we have worked out a precise overview of your current tracking and activities in the area of web analytics, we move on to developing a tracking concept. In it, we describe your current status quo and define in great detail what tracking should be used in the future.
We also determine jointly with you which data should be collected beyond standard tracking, which website goals should be examined, which KPIs should be defined, and how the configuration should be implemented. Figure 2 shows an excerpt from the table of contents of a tracking concept:

Fig. 2: Together with you, we develop a tracking concept.

Analysis and Research of Tracking Opportunities by the Agency

Based on the status quo analysis, we begin by exploring possibilities for enhanced tracking on your website. It is important to us to collect only data that is valuable to you. Of course, we could also recommend tracking that captures everything and nothing, so to speak. In that case, we would collect so much data that you could no longer analyze it, simply because the data volume is too large. That is obviously not our goal. From our perspective, the right measure is one where you can work well with the key figures and receive all the information you need for your evaluations.

Aligning Tracking Goals with the Client

For the reasons mentioned above, we engage in very intensive exchanges with you during this phase. There are many conversations with different people in your company so that we can better understand your requirements for web analytics. Some of the topics we discuss with you include:

Company goals
Website objectives
KPIs already in use
Previous internal reports
Hierarchy levels in the company, including different reports for different contacts
Cooperation with third-party providers
Differences between reporting and web analytics
Motivation for using Google Analytics

Defining KPIs and Goals for Web Analytics

Through these discussions, we not only learn about your requirements but can also better assess which topics in web analytics are particularly interesting for you. Based on this, we can provide targeted recommendations for tracking implementation, KPIs, goals, etc. At this point, you do not yet know our final draft proposal.
Discussion of the Draft Concept with the Client and Approval

Once we have worked out your tracking concept, there is a joint meeting in which we discuss the tracking concept with you in detail. This is especially important as the concept is in parts very technical. It is essential, however, that you understand the concept completely. If you have any requests for changes at this stage, we will discuss them and, if necessary, incorporate them into the concept.

Phase 3 – Implementation of the New Tracking

Once you have approved the tracking concept, we begin implementing the tracking. Here, we always start with the technical implementation before configuring Google Analytics.

Using the Latest Technologies

When we receive an order to implement tracking, we naturally always use the latest technologies. This includes using the Google Tag Manager as well as Universal Analytics. If specific reports are desired, we recommend Google Data Studio to the client.

Google Tag Manager

As we have already written in previous articles about the Google Tag Manager, the entire tag management can be handled easily with it, both for Google Analytics and for other tools like AdWords, third-party providers, etc. A major advantage of the Tag Manager is that IT rarely needs to adapt the source code. The tasks associated with managing tags can then be carried out directly by the marketer in the tool. We have already described the functionality of the Google Tag Manager in a comprehensive article.

Universal Analytics

Universal Analytics is the current version of Google Analytics. Its data collection differs slightly from the old asynchronous Google Analytics and offers additional benefits such as Enhanced E-Commerce, User ID, etc. Google announced in 2015 that it would cease to support asynchronous tracking in the future. When support will be discontinued is still unclear.
Data Studio and Other Offerings

With Google Data Studio, another tool from the Google 360° Suite has been made available as a free version. Since the beginning of February, the tool allows you to create as many reports per email address as you like; the only limitation is that the DoubleClick connector cannot be used in the free version. Otherwise, it has the same functionality as the paid version from the 360° Suite. Particularly interesting for marketers is that the reports can be styled with your own corporate identity. This is especially useful when the reports are forwarded to management.

Collaboration with IT or the IT Service Provider

Once we start implementation, usually only a few adjustments need to be made to the source code. The old tracking remains in place for the time being, ensuring that the main tracking is not affected. For all changes to the source code, the IT department or service provider receives precise instructions from us on what needs to be changed. Generally, the Google Tag Manager code needs to be embedded, but often also the opt-out cookie. If e-commerce is used or conversion values are to be transmitted dynamically from the website, adjustments are required for this as well. The entire configuration of your new tracking is then implemented by us in the Google Tag Manager. This means we set up the Universal Analytics tag and create variables, triggers, and other tags to fulfill the specifications from the tracking concept. If we also support the client in other online marketing areas like Google AdWords, we implement this directly in the Google Tag Manager as well. This allows for easier management in the future. Throughout the entire phase, we work closely with you and your IT to ensure a correct implementation. If questions arise on the IT side, we offer advisory support. If we maintain your website or content management system, we naturally carry out the adjustments ourselves.
Configuration of the Tools Used

Following the basic setup of Universal Analytics in the Google Tag Manager, we set up a so-called test property in Google Analytics that we use to test the data we collect. If we did not do this, we would have to direct the new data into your main property, which would lead to data distortion and problems with clarity that we want to avoid. Only when we know that all data is transmitted correctly from the Google Tag Manager to Google Analytics do we configure the Google Analytics settings of the main property. This is because some configurations are only possible once certain tags are configured in the Google Tag Manager. Once the data is transmitted correctly by the Google Tag Manager and Google Analytics is configured correctly, we inform IT that they can remove the old tracking code. Once this has happened, we adjust the tracking in the Google Tag Manager so that the data no longer flows into the test property but into your main property. This way, we ensure that your old metrics in Google Analytics are not lost and that you can compare your new data with the old data. It should be noted, however, that your old data may not be valid.

Setting Up Reporting

Part of the configuration of Google Analytics and Google Data Studio may include setting up reports. These are created according to the specifications of the tracking concept.

Client Training in the Use of the Tools

Once the entire tracking has been implemented, you need to understand the implementation and be able to work with it independently. For this reason, we conduct an introduction to the implementation, depending on your previous knowledge. This covers all tools used (Google Tag Manager, Google Analytics, and Google Data Studio). It is purely an introduction to the implementation. If you as a client have very little experience with Google Analytics and, for example, none with the Google Tag Manager or Google Data Studio, we recommend a thorough web analytics training.
In this training, we explain not only all the tools but also the basics of web analytics. Depending on the scope, it lasts between one and two days. Learn more about our Google Analytics training sessions.

Phase 4 – Ongoing Support

So that the new metrics from Google Analytics do not go unused after setup, we support our clients after implementation with ongoing support. This generally takes two forms: on the one hand, we ensure that the data continues to be collected correctly; on the other hand, we conduct more in-depth analyses.

Regular Analyses and Creation of Evaluations by the Agency

This allows you to focus on your reports while we conduct comprehensive web analytics for you. Here, we focus on specific subareas. For example, we analyze the behavior of individual target groups, examine the behavior of users in specific areas of the website, or develop, via a custom channel analysis, the basis for an attribution model. In a second step, we refine this together with you so that you can understand exactly how users come into contact with your company and how you might optimize individual channels to target users more precisely. We also support you in areas such as A/B and multivariate testing. As these tasks are time-consuming but also very productive, our clients gladly take advantage of our expertise and leave them to us.

Continuous Support in Case of Problems

This topic primarily involves technical support. It often happens that errors in data collection occur due to website adjustments, for example through the renaming of individual buttons, the removal of website elements, or the complete reworking of a website element. During implementation, IT generally does not consider that such changes might affect tracking. These errors usually become apparent to us quickly, and we can correct them in consultation with you.

What Can We Do for You?
Would you like to check your tracking, significantly expand it, or simply outsource the evaluation of the data to a third party? Do you want to conduct targeted optimizations with your web analytics and thus increase your leads? Contact us, and we will be happy to advise you on how we can best work together.
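The broken IP exclusion filters mentioned in Phase 1 usually come down to a simple matching question: does the visitor's address actually fall inside the range the filter defines? This logic is easy to reason about with Python's `ipaddress` module (the ranges below are made-up documentation addresses, and this is an illustration of the matching logic, not Google Analytics' implementation):

```python
import ipaddress

# Hypothetical internal ranges that an "exclude internal traffic"
# filter is supposed to cover.
INTERNAL_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),   # office network
    ipaddress.ip_network("203.0.113.42/32"),   # single VPN gateway
]


def is_internal(ip: str) -> bool:
    """Would this visitor IP be excluded by the filter?"""
    address = ipaddress.ip_address(ip)
    return any(address in network for network in INTERNAL_RANGES)


print(is_internal("198.51.100.77"))  # inside the office /24: excluded
print(is_internal("203.0.113.43"))   # one address off the /32: NOT excluded
```

The second case is the typical failure mode: a filter pinned to a single address silently stops matching as soon as the office IP changes, which is why such filters need to be re-verified regularly.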
Check Google Analytics Implementation with Screaming Frog
Sep 15, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Modern websites and content management systems have become very complex, making it challenging for users to make technical adjustments. From our day-to-day business, we know that online marketers often encounter issues, especially with tracking. But only through valid tracking can you generate clean data in the analytics tools. Only then are qualitative analyses possible. In this article, we present a method that allows you to easily check whether the Google Analytics code is installed on all pages. Furthermore, we will explain how to verify the code implementation. Screaming Frog – The Tool for SEO A well-known SEO tool is the Screaming Frog program. The main task of the tool is to crawl websites. For this purpose, a crawler (also referred to as a bot) is sent to the respective website to gather information on all subpages and their contents. Marketers can thus easily check to what extent the website has potential for SEO. In the free version, up to 500 website elements (HTML pages, images, CSS files, etc.) can be analyzed. The paid version is much more powerful and crawls all pages. Besides standard reporting, the tool offers the possibility to connect with external services such as Google Search Console and Google Analytics to obtain even more accurate analyses. Additionally, it also provides the option to conduct investigations on the website with custom filters. Essentially, this involves two different methods: Search: The affected pages are listed with URLs in Screaming Frog. Extraction: In this case, the desired content of the affected pages is displayed by Screaming Frog. Configuration of Screaming Frog Figure 1 shows Screaming Frog immediately after opening the program. By typing a URL in the bar and clicking on Start, the corresponding page or domain is immediately crawled, and initial information about the URLs flows into almost all tabs. The "Custom" tab will be important for you later. Fig. 
1: The structure of Screaming Frog Identifying Pages Where the Google Analytics Code Is Missing If you want to identify pages or URLs where, for example, the Google Analytics code is not embedded, click on the "Search" button as shown in Figure 2. Fig. 2: Using custom filters in Screaming Frog In the subsequent dialog, Screaming Frog offers you ten different filters. To search for pages where Google Analytics is not installed, you only need one filter. Set the first field to "Does Not Contain" and enter the UA number of your Google Analytics property. You can enter other values there as well, but it is important to use something unique to the Google Analytics code. Also ensure that the entry does not match any other element in the source code. From our point of view, the Google Analytics UA number is a good choice. Fig. 3: Using the search filter in Screaming Frog If you want to check whether the Google Tag Manager is correctly embedded, you could enter your container ID in the form "GTM-XXXXXX". In this case, you are likewise using an element unique to the Google Tag Manager. Subsequently, under the "Custom" tab, all pages where the corresponding search term is not found will be listed. This way, you can easily identify which pages still need adjustments to achieve complete tracking, and you can approach your IT department with a specific action plan. Checking the Google Analytics Property ID on All Pages Even if Google Analytics is installed on all web pages, you should additionally verify that the correct UA number is used everywhere. Mistakes creep in easily, and then your tracking is not valid either. The "Extraction" method is perfect for this. Click on "Extraction" as shown in Figure 2. In the following dialog (Figure 4), enter a name such as "Google Analytics UA number" in the first field. This field is used to name the corresponding column in the "Custom" tab. Fig.
4: Setting an extraction filter to read the value of the Google Analytics UA number In the second field, select "Regex". Regex is short for "regular expressions" and makes it possible to identify exact letter, number, and character combinations within a given text area, e.g., the source code of a webpage. To find Google Analytics elements on the page, use the following regex: ["'](UA-.*?)["'] This way, you can see which Google Analytics UA number is used on each page of your domain. You can easily spot any typos and correct them afterward. For the Google Tag Manager, you would use ["'](GTM-.*?)["'] as the regex filter instead. In the next figure, you can see how the results are then displayed in the "Custom" tab. Fig. 5: "Custom" tab Conclusion Besides its diverse SEO analysis capabilities, Screaming Frog is highly extensible through custom filters. Users can also check non-SEO content and gain insights that greatly simplify daily work. What Can We Do for You? Are you unsure whether Google Analytics is correctly embedded on your site, or do you wonder whether Google Analytics tracking can be extended beyond standard tracking? Do you want to measure conversions even more accurately? Contact us, and we will be happy to advise you on checking existing tracking, creating a tracking concept, or implementing tracking. We look forward to your inquiry.
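The extraction pattern described above can be tried out in any regex engine before you run a full crawl. A minimal Python sketch, where the page snippet and the property ID "UA-1234567-2" are invented for illustration:

```python
import re

# The extraction regex from the article: capture whatever sits between
# quotes starting with "UA-", non-greedily, so it stops at the closing quote.
UA_PATTERN = re.compile(r'["\'](UA-.*?)["\']')

# Hypothetical page source with an invented property ID.
page_source = """
<script>
  ga('create', 'UA-1234567-2', 'auto');
  ga('send', 'pageview');
</script>
"""

found = UA_PATTERN.findall(page_source)
print(found)  # ['UA-1234567-2']
```

The non-greedy `.*?` matters: a greedy `.*` would swallow everything up to the last quote on the line instead of stopping at the closing quote of the property ID.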
Set filters in Google Analytics
Jul 28, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Google Analytics is the most popular web analysis tool. As a website operator, you can understand the behavior of your users on the site and subsequently define appropriate measures for page optimization. To ensure that you only find relevant data in your evaluation, it is wise to set certain filters. Why are filters useful in Google Analytics? Google Analytics captures all user data on the website without filtering. However, not every recorded value is useful for an analysis. With the help of filters, you can channel the data through a kind of sieve so that your data view only shows the values you need. Generally, filters fall into the following main categories: Inclusion filter: only data matching a defined filter pattern gets through Exclusion filter: data matching the defined filter pattern is excluded Search & Replace: designations recorded by Google (e.g., homepage = "/") are rewritten Lowercase: reduction of duplicate results by converting everything to lowercase letters A common example of setting a filter is the exclusion of an IP address range. This allows you to exclude data from your company or agency network if it uses a static IP. The same applies to your IT service provider who regularly visits the site with a static IP address without being relevant for site optimization. This filter does not work with dynamic IP addresses, as they typically change every 24 hours. Access from spam or bots can also be partly excluded by filters. Many spam referral URLs are known and published on portals; the bots behind them visit your site and skew your traffic. Check the referrals that lead users to your website: spam traffic typically shows 1 page/session, 0:00 time on site, and a bounce rate of almost 100%. You can get an overview of the referral URLs via the left navigation bar under Acquisition, All Traffic, Referrals. Fig.
1: Display referrals in Google Analytics You can also detect spam behavior in the Google Analytics reports under "Locations", "Website Content", and "All Traffic". A common example of spam access is the following: Fig. 2: Example of spam access How to exclude the IP address range for your corporate network If you want to check user behavior with the web analysis tool Google Analytics, visits from your own network get in the way. Internal page views do not reflect the behavior of genuine visitors. To prevent accesses from your own ranks from appearing in your data, set a filter as follows: Navigate to your Google Analytics account and open the admin view: Fig. 3: Admin view in Google Analytics In the right column of the data view, you will already see the "Filter" field. Here you can now add a new filter. Configure the filter: give it a meaningful name, select "predefined" as the filter type, and use an "Exclude" filter. Choose the "begins with" command and enter the start of the IP address to be excluded in the last field. You should never use the "equals" command: due to the data protection-compliant anonymization of the IP address, the fourth and last part (the last octet) is always truncated, so the complete sequence of numbers never reaches Analytics. Fig. 4: Filter to exclude IP addresses How to exclude referral URLs If checking your referral URLs revealed that certain visits come from bots and thus flow into your analysis as spam, you should create a filter to exclude this data. First, collect or research a list of these URLs, and again create a custom "Exclude" filter. Give it a meaningful name and use the referral as the filter field. Then enter all the researched URLs into the filter pattern. Separate the individual URLs with a pipe (|). Add a backslash (\) before EACH dot so that the dot is treated as a literal character rather than as the regular-expression wildcard. Avoid spaces in the filter pattern entirely.
Unfortunately, Google Analytics allows only 255 characters in the filter pattern, so you may need to set up multiple filters to exclude all referrals. Test filters before applying Once you have set the filters, they are immediately active. However, filtered data cannot be recovered; the sieving effect is irreversible. We therefore recommend testing the filters in a test view before applying them to the live environment. Also use a third data view that collects all your website data without filters. This way, you can still reconstruct the correct values in case of possible errors. How you should configure the data views can be found in our article Google Analytics Basic Configuration – What You Should Pay Attention To. What we can do for you The proper use of filters cleans up your data so that you can draw meaningful results from analysis with Google. If you need support with Web Analysis, we look forward to your inquiry.
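The escaping and pipe rules above are easy to get wrong by hand. A small Python sketch that builds such a filter pattern from a list of spam referrer domains (the domain names are assumptions for illustration) and checks the 255-character limit:

```python
import re

# Hypothetical spam referrer domains to exclude.
spam_referrers = ["semalt.com", "buttons-for-website.com", "darodar.com"]

# Escape every dot so it matches a literal ".", then join the domains
# with a pipe (regex alternation). No spaces anywhere in the pattern.
pattern = "|".join(d.replace(".", r"\.") for d in spam_referrers)
print(pattern)  # semalt\.com|buttons-for-website\.com|darodar\.com

# Google Analytics caps a filter pattern at 255 characters; longer lists
# must be split across several filters.
if len(pattern) > 255:
    print("Pattern too long - split the list across multiple filters")

# Sanity check: the pattern matches the spam hosts but not look-alikes
# where the dot is replaced by another character.
assert re.search(pattern, "semalt.com")
assert not re.search(pattern, "semaltXcom")
```

Without the backslash, `semalt.com` would also match strings like `semaltXcom`, because an unescaped dot matches any character.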
The recording feature of Google Tag Assistant
Jun 30, 2016

Thorsten
Abrahamczik
Category:
Web Analytics

Many companies incorporate Google Analytics, a free web analysis tool, into their website to get an overview of their visitor numbers. However, this is often where the problems begin. In many cases, the tool or its code is simply copied into the page's source code. This ignores the fact that the Google Analytics code must be placed in a specific spot and that the IP address needs to be anonymized. It is also seldom considered that in certain cases the referrer, meaning the reference/link through which a user came to your site, is not correctly passed on. In this case, the website visitor is recorded as a direct entry and not as a visit via a referral. As a result, users are working with skewed numbers. To identify and avoid these issues, Google offers a practical browser extension for Chrome called Google Tag Assistant, as mentioned in the blog article The Best Tools for Successful Entry into Web Analysis. The Functions of Google Tag Assistant First, Google Tag Assistant displays information about all Google tags. These typically include Google Analytics, Remarketing, Conversion Tracking, DoubleClick, and others, as shown in Figure 1, which also shows the button for the record function. Figure 1: Overview of Google Tag Assistant For each individual tag, you can display detailed information about its current status. A color-coding system further indicates whether the tag is correctly implemented (green), whether there are slight deviations from the implementation recommendations (blue or yellow), or whether there are significant problems causing errors in tag execution (red). The level of detail in the provided information can be configured in the Tag Assistant settings for each tag type. The Record Function As illustrated in Figures 1 and 2, a record function has been added to the tool with Google Analytics in mind, which allows for precise analyses across multiple page views.
This enables you as a user to easily determine whether your data is being correctly captured and processed for Google Analytics. The advantage of this method: data is measured individually for all executed Google Analytics tags. In addition to page views, you can, for example, also examine event tracking or e-commerce tracking. This way, you can find errors that you would otherwise only uncover with significantly greater effort. Figure 2: Recording has begun Using this, we were able to discover an error in a customer's cross-domain tracking. It was correctly configured for page view tracking, but not for several event tags. As a result, the referrer was correctly passed during page views of the second domain, but not for the event tags, where the fallback "Direct/None" was used instead. This verifiably distorted the metrics in Google Analytics. After a corresponding adjustment to the event tags, the appropriate referrers were passed on correctly, and the metrics flowed into Google Analytics correctly once more. An advantageous feature is the ability to directly link Tag Assistant recordings with Google Analytics. In the analysis area of Google Tag Assistant, you can selectively choose individual data views of the Google Analytics account. You can also send data from a specific location to check whether certain hits, for instance due to IP address filters, fail to arrive in Google Analytics. In such cases, you can make the necessary adjustments in your Google Analytics configuration and immediately update the recording with a click of the refresh button. A new recording is not needed to see the results of the adjustments immediately. Figure 3 shows a section of the analysis area: Figure 3: Overview of the Google Analytics Record Function Analysis in Google Tag Assistant What We Can Do for You At internetwarriors, the Google Tag Assistant with its record function is one of the standard tools in the field of web analysis and SEA.
We would be happy to review your Google implementation as well and identify any potential errors. If desired, we can also fix them. More accurate metrics improve the quality of your analyses and, in turn, your marketing budget allocation. Contact us.
Paid landing pages – what should you pay attention to? Tips, tricks, etc.
Apr 29, 2026

Josephine
Treuter
Category:
Search Engine Advertising

A strong ad is only half the battle: the landing page determines whether a click actually turns into a conversion. If you invest in Google Ads, Meta, or LinkedIn, you should pay at least as much attention to the landing page as you do to the ad creative. In this article, we’ll show what makes a successful paid landing page, which components are essential, and which tips and tricks you can use to get the most out of your campaigns. The key points at a glance A paid landing page (also called a conversion page or PPC landing page) is a page created specifically for paid advertising campaigns with a clear conversion goal. Unlike a classic website, it avoids distracting navigation and focuses on a single action, such as a purchase, a signup, or lead generation. Successful campaign pages convince with a clear headline, a strong USP, trust-building elements, and a prominent call to action. Mobile optimization, short loading times, and a consistent message match between the ad and the landing page determine success or failure. A/B testing and clean tracking are essential for continuously improving performance. What is a paid landing page? A paid landing page, often also referred to as a campaign page, conversion page, or PPC landing page, is a standalone page designed specifically for a paid advertising campaign. Unlike a classic homepage, it pursues one single goal: to turn visitors who arrive via a Google Ads, Meta, LinkedIn, or other paid ad into customers or leads. The term "paid" refers to the traffic source. Unlike organically reached users who come to the page via search engines, social media posts, or recommendations, visitors arrive at the landing page exclusively through paid ads. Every click costs money, which is exactly why the page must be designed so that this click reliably leads to an action.
The difference from a classic website While a company website covers many topics and serves different target groups, a landing page is minimalist and purpose-driven. There is no main navigation, no distracting links, and no unnecessary content. Everything on the page works toward one single call to action, whether that is a purchase, filling out a form, or a download. The two formats also differ significantly when it comes to measuring success. While a company website is measured by metrics such as sessions, time on site, or page views, a landing page is practically judged by just one metric: the conversion rate. Every element on the page, from the image to the headline to the button text, is consistently aligned with that goal. Why do you need a dedicated landing page for paid campaigns? When you run ads, you pay for every click, regardless of whether it leads to a conversion. If you simply send visitors to the homepage, a lot of potential is often lost: the ad message is not picked up, users get lost in the navigation, and leave the page. A dedicated lead landing page ensures that the promise made in the ad is delivered immediately. Specific campaign pages usually achieve significantly higher conversion rates than general websites. In addition, advertising platforms such as Google Ads reward relevance with better quality scores, which in turn lowers click prices and makes the ad budget more efficient. The most important building blocks of a successful landing page A good conversion page follows a clear structure. These elements should never be missing: Clear headline and convincing USP: The headline is the first thing visitors see, and within seconds they decide whether to stay or click away. It must clearly communicate which problem is being solved or which benefit awaits. Directly below it, a subheadline specifies the unique selling point. Convincing visuals: Images and videos convey messages faster than text. 
Authentic photos have more impact than generic stock images, and product videos or explainer clips can noticeably increase the conversion rate. A prominent call to action: The CTA button is the centerpiece of every campaign page. It should stand out visually, be clearly worded ("Try it free now", "Book a consultation") and ideally appear multiple times on the page without being pushy. Build in trust elements: Trust is the decisive factor, especially when the brand is new to visitors. Customer testimonials, reviews, seals of approval, well-known reference logos, or awards work wonders. Transparent information about privacy and delivery terms also lowers barriers. Mobile optimization and short loading times: More than half of all paid clicks now come from mobile devices. A landing page must work just as well on a smartphone as it does on desktop. Loading times over three seconds lead to massive drop-offs — every additional second can reduce the conversion rate by double-digit percentages. Tips & tricks for more conversions: With a few targeted adjustments, a good landing page can become a truly strong one. Message match: the ad and landing page must align: If an ad promises a free demo, that demo must be shown prominently on the landing page as well. The so-called message match — meaning the content and visual alignment between the ad and the destination page — is one of the biggest levers for higher conversion rates. A/B testing as a must: Even small changes can have a big impact: a different headline, a new button color, another image. A/B tests help you find out which version actually performs better instead of relying on gut feeling. Set up clean tracking: Without valid data, nothing can be optimized. Conversion tracking, heatmaps, and session recordings show what works on the page and where visitors drop off. Tools like Google Tag Manager, GA4, or Hotjar provide valuable insights for this purpose. 
Keep forms as short as possible: Every additional field costs conversions. Only ask for what is truly needed. On a lead landing page, name, email address, and one or two specific details for later qualification are often enough. Avoid common mistakes on campaign pages: Many companies underestimate how quickly a landing page can fail. Classic pitfalls include too much text, unclear CTAs, missing mobile optimization, the wrong target audience, or landing pages that are simply copies of the homepage. Missing trust elements or insufficient GDPR notices also have a negative impact. It is also problematic to launch paid campaigns without preparing a matching destination page. If you want to appear professional and not burn through your ad budget, you should create a dedicated page for each campaign, or at least for each main target group. Conclusion: paid landing pages are not a nice-to-have A well-thought-out landing page is the decisive lever between click and conversion. It saves ad budget, boosts the performance of your campaigns, and creates a professional brand experience. Anyone investing in paid channels should therefore pay at least as much attention to the destination page as to the ad itself, because even the best campaign is useless if the landing page does not convince. At the same time, a landing page is never truly "finished." User behavior, platform algorithms, and the competitive environment are constantly changing, which is why successful companies treat their campaign pages as an ongoing optimization process. Anyone who thinks strategically from the start and aligns headline, visuals, CTA, trust elements, and tracking properly can turn expensive traffic into profitable customer relationships — and turn an average paid campaign into a truly successful one. FAQ What is the difference between a landing page and a campaign page? The terms are often used synonymously. 
A campaign page is a specific type of landing page created for a particular marketing campaign, such as a product launch or a time-limited promotion. Do I need a separate landing page for every ad? Ideally, yes — at least for each target group or offer. The more closely the page matches the ad content, the higher the conversion rate and the better the quality score on platforms like Google Ads. How long should a PPC landing page be? That depends on the offer. Simple lead generation works with short pages, while products that require more explanation or higher-priced offers need more content, arguments, and trust elements. How do I measure the success of a conversion page? By clearly defined KPIs such as conversion rate, cost per conversion, bounce rate, and time on page. Tools like GA4, Google Ads, and heatmap software provide the data needed for a solid evaluation.
AI Mode and AI Overview in Google Ads – What should you keep in mind?
Apr 22, 2026

Markus
Brook
Category:
Search Engine Advertising

The key points at a glance Google has fundamentally changed: Instead of blue links, AI-generated answers dominate the search results page — with direct effects on Google Ads. AI Overviews have been active in Germany since spring 2025. Ads can already appear above, below, and in some cases within the AI responses. Ads directly in Google AI Mode are currently being tested in the US and will soon also come to Germany. Only certain campaign types qualify for these new placements — above all Broad Match, AI Max for Search, Performance Max and Shopping Ads. Anyone who still works exclusively with Exact Match or a rigid campaign structure today will lose visibility in the future exactly at the moments that matter. AI Max for Search is currently the fastest-growing AI feature in Google Ads and a key lever for the new placements. Anyone who optimizes their campaign structure, data quality and assets now will secure a decisive head start. Search has fundamentally changed Anyone searching on Google today increasingly gets not a list of links, but a direct answer. The search results page advertisers have grown used to over the years looks fundamentally different in 2026 than it did just two years ago. Two technologies are driving this change: AI Overviews are AI-generated summaries that have also been active in Germany since spring 2025. They appear at the top of the page for more complex or informational search queries and often answer the question so completely that many users do not scroll any further. This changes where and how ads are perceived and which ones are served at all. Google AI Mode has taken things a step further. Available in Germany since October 2025, it is a standalone, conversational search interface. Users no longer type in individual search terms, but have real dialogues, similar to an AI assistant. The intent behind these queries is often far more layered, and the context more complex.
For Google Ads advertisers, this means: Reaching the right audience no longer depends only on precise keywords, but on understanding intent, context and conversation flow. The AI decides and it decides based on data and signals, not manually maintained keyword lists. Where do ads actually appear — and which campaigns qualify? This is the most practical question advertisers ask: Where exactly do my ads appear, and what do I need to do for that? In AI Overviews Ads can appear in three places around an AI Overview: above, below, or directly within the AI answer. Placement above and below is already available in all markets where AI Overviews are active, including Germany. Integration directly into the answer text is currently limited to English-language markets. Important to understand: There is no separate opt-in for these placements. If you use the right campaign types and have relevant ads, you are automatically considered. Just as little can this placement be specifically excluded. Google evaluates both the actual search query and the content of the AI-generated answer to decide whether an ad fits. This is a key difference from classic keyword logic: relevance is now measured in the context of the entire answer, not just the individual search term. In Google AI Mode Tests are currently running here in the US. Ads appear there directly embedded in the conversational responses — not as separate blocks, but as an integrated part of the AI answer. This is an even tighter context than with AI Overviews. The global rollout, including for Germany, has been announced, but no specific date has been set yet. Which campaign types are actually qualified? This is the point where many advertisers get stuck. Not every campaign is automatically served in AI Overviews or AI Mode. 
Google has clearly defined which campaign types qualify: Search Ads with Broad Match keywords AI Max for Search Performance Max (PMax) Shopping Ads Campaigns that work exclusively with Exact Match or Phrase Match are not qualified for these placements. This is a structural turning point: anyone who still relies on hyper-granular keyword structures today will, over time, lose impression share exactly at the moments when users are most ready to buy. AI Max for Search: What is behind it and why is it so relevant right now? AI Max in Google Ads is not a new campaign type, but a feature package that can be integrated into existing search campaigns. Activated with one click in the campaign settings, it fundamentally changes the campaign logic. Specifically, AI Max combines two approaches: first, the familiar Broad Match technology, which also matches search queries when the exact wording differs from the entered keywords. Second, so-called keywordless serving — similar to Dynamic Search Ads in the past, but much smarter. The AI independently recognizes which search queries an ad would be thematically relevant for, even without a stored keyword. To this are added three other core features: Automated text adaptation: Google generates new headlines and descriptions based on existing ad titles, descriptions, and landing page content — and selects in real time the combination that best fits the respective search query. Since February 2026, text guidelines have been available worldwide for all advertisers: there you can define which wording the AI may use and which it may not. URL expansion: Users are automatically sent to the page on your website that best matches the search query — not necessarily the URL stored in the campaign. Certain pages can be excluded from the system. Brand controls: Advertisers can define for which brands ads should appear and for which they should not. This is especially relevant for accounts that actively manage competitor or brand campaigns. 
When does AI Max pay off — and when does it not (yet)? AI Max shows its strengths above all in accounts that already have enough conversion data and target broad audiences. In e-commerce and with B2C products with high search volume, results are typically strongest. In niche markets, with very explanation-heavy B2B products, or accounts with only a few daily conversions, the rollout should be more cautious. An A/B test with a 50/50 split between the existing campaign and the AI Max version is the most sensible first step here. What applies in any case: the foundation has to be right. Clean conversion tracking, a data-driven attribution model, and clear conversion goals in the account are mandatory. Anyone activating AI Max without this foundation leaves the AI in charge without a map or compass. Performance Max: Google’s preferred channel for AI Overviews Performance Max is not new, but its role has shifted. Google increasingly sees PMax as the main format for serving in AI-driven surfaces. This is because PMax was built from the ground up for data-driven, cross-channel serving: it provides the AI with text, images, videos and audience signals, and leaves the optimal combination to it. For advertisers, this means: Anyone who has already set up PMax properly and regularly maintains asset groups is well positioned for AI Overviews and the AI Mode. Anyone not yet using it should start now at the latest — with clear goals, enough assets and regular monitoring of search terms. A good sign: PMax has become significantly more transparent in recent months. Negative keywords can now be added directly, and the channel reporting shows which channel (Search, YouTube, Display, Gmail, Discover) contributes what to performance — without additional scripts or workarounds. What this means for campaign structure Many accounts have grown historically: strict match type separation, single keyword ad groups, dozens of ad groups for minimal differences. 
That used to make sense to maintain control. Today, this structure works against the AI. If you split data across too many campaigns, you give the algorithm too little material to learn from. Instead of quickly recognizing patterns and optimizing, it stalls. The current approach that has proven effective in practice looks like this: topic-based campaigns with a manageable number of keywords, a combination of Exact and Broad Match, Smart Bidding as standard. Not maximally granular, but maximally data-dense. That does not mean giving up control completely. Negative keywords, audience signals, text guidelines and regular review of search queries remain active levers. The foundation: data quality decides Here is a mistake that runs through almost all accounts: people discuss campaign types and features before the data foundation is right. But the rule is: Garbage in, garbage out. If you feed the AI bad data, you are only automating budget burn. Server Side Tracking (SST) is the foundation. Classic browser tracking increasingly loses data due to ad blockers, cookie restrictions and iOS updates. Server Side Tracking bypasses these hurdles and, in practice, delivers at least 12% more usable data points — signals that Smart Bidding and AI Max urgently need for optimization. In addition, advertisers should actively use the following data sources: First-party data / customer lists : Existing and new customers can be evaluated differently in a targeted way via Customer Match lists. In the area of new customer acquisition, Smart Bidding can be prompted to weight new customers more heavily — with concrete effects on bid logic. CRM data (offline conversions) : Especially in B2B, it makes no sense to treat every lead equally. Anyone feeding back CRM data (e.g., from HubSpot or Salesforce) via offline conversions gives Google Ads the signal to distinguish between "poor" and "valuable" — and that is exactly the prerequisite for sustainably profitable growth. 
Conclusion: Act now before the market does Google Ads in 2026 is a data-driven system, not a manual tool. The question is no longer whether to use AI Max, AI Overviews and modern tracking structures — but when. Anyone who actively shapes the transformation now secures visibility at the moments that really matter. As an experienced Google Ads agency, we guide you through exactly this process: from tracking infrastructure to campaign structure to AI Max and Performance Max. Get in touch now → FAQ Will my Google Ads be served automatically in AI Overviews? Not automatically. Ads appear in AI Overviews when the ad matches both the search query and the content of the AI answer. Another requirement is that you use Broad Match, AI Max or Performance Max. Does advertising in Google AI Mode cost more than classic Search Ads? There is no separate pricing model for AI Mode ads. Google's auction system stays the same — placement is determined by relevance, quality score and bid. Can I exclude my ads from AI Overviews? No. Google currently does not offer a way to specifically disable these placements. Do I get separate reporting for AI Overview ads? Not yet in full. At present, ads in AI Overviews are counted as "Top Ads" and appear accordingly in standard reports. Dedicated segment reporting has been announced for the future, but is not yet available. When will ads in Google AI Mode also come to Germany? There is no official date yet. Ads in AI Mode are currently being tested in the US (as of March 2026). The international rollout has been announced. Does AI Max also make sense for smaller accounts? That depends on the individual case. In principle, AI Max needs a solid data foundation — meaning enough conversions, clean tracking and clear goals. For accounts with only a few daily conversions, we first recommend a controlled A/B test before the entire campaign is switched over. Do I need to create new campaigns to appear in AI Overviews? No.
Existing campaigns qualify automatically, provided the right campaign types and match types are used.

What is the difference between AI Overviews and AI Mode?
AI Overviews are AI summaries within the normal Google search. AI Mode is a separate, conversational search interface for complex, multi-step queries — comparable to an AI chatbot directly in search.
Agentic Commerce & Agentic Shopping 2026: Why AI Shopping Agents are Rewriting Commerce
Mar 30, 2026

Moritz
Klussmann
Category:
Artificial Intelligence

The world of online marketing is spinning faster today than ever before. While we've been fighting for clicks and conversions at internetwarriors since 2001, we're currently experiencing the most radical upheaval in our history. The trigger: Agentic Commerce. We are transitioning from mere information search to task-oriented execution. Today, a user no longer just asks for products; they instruct an AI shopping agent to autonomously handle the entire purchase process. In this article, I'll show you why the failure of OpenAI's "Instant Checkout" is not the end of the hype, but the starting point for a new technical infrastructure that you as a retailer need to know now.

The OpenAI Pivot: From Shopping Cart to Discovery Platform

In March 2026, OpenAI ended its "Instant Checkout," prompting one of the most heated debates in e-commerce. Failure or strategy? We reveal what is really behind the pivot and what it means for retailers.

What was Instant Checkout?

In September 2025, OpenAI launched the Agentic Commerce Protocol (ACP) with Stripe, bringing "Instant Checkout" to ChatGPT. The vision: users find a product in the chat and buy it directly without leaving the platform. Etsy, Walmart, and Shopify were the first partners – Shopify president Harley Finkelstein called it a "new frontier" for online retail.

Why did direct checkout fail?

In early March 2026, OpenAI pulled the plug. What critics dismiss as the failure of Agentic Commerce is, on closer inspection, a strategic pivot from which we can learn a lot. OpenAI underestimated the immense complexity of global commerce. Three critical factors made direct purchase completion in the chatbot impossible.

The three technical killers:

1. Lack of real-time synchronization: The inventory data of millions of retailers could not be reconciled at the required speed – outdated prices and stock levels immediately shattered user trust.
2. Compliance hurdles: Systems were missing for automated calculation of regional taxes (in the US alone, thousands of local tax jurisdictions) and for compliance with local laws like the Price Indication Regulation (PAngV) in Europe.
3. Fraud prevention: Agent-based transactions require completely new security architectures to prevent automated abuse.

Another factor that is rarely mentioned in reporting: the withdrawal came immediately after Amazon's $50 billion investment in OpenAI. Amazon controls 40 percent of US e-commerce and is building its own AI shopping tool with Rufus. Whether coincidence or strategic calculus – the timing is remarkable.

🟢 Update, March 25, 2026: Alongside the checkout withdrawal, OpenAI has launched a completely new shopping experience: visual product browsing, side-by-side price comparisons, and image upload for product searches. Seven major US retailers – including Target, Sephora, Nordstrom, and Best Buy – are already live via ACP. Walmart operates a dedicated in-ChatGPT app with loyalty integration and native Walmart payment. This is not a withdrawal – this is a pivot.

The new Warrior reality: OpenAI is now focusing primarily on product discovery through ACP. The checkout returns to the retailer – but the decision of which retailer gets the order is increasingly made by the agent.

Agentic Shopping works – just not yet in the West

Anyone who believes that the failure of Instant Checkout proves Agentic Shopping is just hype is making a categorical mistake. Alibaba's Qwen app is already completing food orders, travel bookings, and product purchases entirely within a single conversation – and at scale. The decisive difference: Alibaba owns the AI model, the marketplace, the payment infrastructure, and the logistics, all from one source. OpenAI attempted to replicate the same without owning this stack. It was structurally doomed to fail.
Google UCP: The new operating system of commerce

While OpenAI is correcting course, Google is creating facts with the Universal Commerce Protocol (UCP). Unlike closed systems, UCP is an open standard that allows AI agents to communicate directly with merchants' backends – from discovery through checkout to post-purchase management. For you as a retailer, this means: your Google Merchant Center (GMC) becomes the critical interface for AI in e-commerce. Google has introduced new attributes to make your products machine-readable:

· product_faq – questions and answers directly extractable from the feed for AI agents
· product_use_cases – specific scenarios in which your product offers the best solution
· native_commerce – a switch signaling whether your product is ready for autonomous checkout

The advantage for Germany: Google Merchant Center and Google AI Mode are already active in DACH. Retailers who optimize their feed now secure a real head start.

SEO alone is no longer enough: Welcome to the era of GEO

Our analysis of German e-commerce shops shows a clear picture: a top ranking in traditional search does not guarantee visibility in AI responses. Over 60 percent of URLs linked in AI Overviews do not rank in the top 50 of traditional Google search. The rules have changed. This is where Generative Engine Optimization (GEO) comes into play – the discipline of optimizing content not for human clicks but for extraction by AI systems.

Feature | Classic SEO | Generative Engine Optimization (GEO)
Target group | Human users | AI agents & Large Language Models
Primary KPI | Click-through rate (CTR) & rankings | Mention rate & citation authority
Content logic | Keywords & readability | Semantic depth & fact density
Technical basis | Crawlability & loading speed | Structured data & API connectivity
Success measurement | Google Search Console (rankings) | Brand mentions in LLM responses

Warriors Insight: In Germany, AI Overviews already appear in 33 percent of all search queries.
If you don't opt for GEO now, you will become invisible to the "agent customer" before they ever reach your website.

Strategic Warriors Knowledge: Brand power and the 95:5 rule

In the Agentic Web, it's no longer just the keyword that counts, but the authority of your brand as an "entity" – how a Large Language Model knows, categorizes, and recommends your brand.

The 95:5 rule in B2B

Only 5 percent of your target group is ready to buy at any given time (in-market). The remaining 95 percent need to be reached through thought leadership and long-term trust building. AI agents prefer brands that are anchored as expert entities in the knowledge graphs of Large Language Models. Those who only optimize for transactional keywords lose the majority of their potential customers before they are ready to buy.

Preferred Sources: The democratization of the algorithm

Google now allows users to actively mark their preferred sources. These "Preferred Sources" receive a permanent visibility boost – regardless of algorithm updates. This fundamentally changes the game: trust is the new currency. You must persuade users to actively choose your brand as trustworthy – not just rank well.

Checklist: Make your shop agent-ready now

For German retailers, the groundwork begins today, even though fully autonomous Agentic Shopping in DACH is still 12–24 months away.

Product data excellence in Merchant Center: Maintain GTINs, precise attributes, and the new UCP fields (product_faq, product_use_cases). A flawed feed is the largest AI visibility obstacle you can control yourself.

Technical infrastructure for AI agents: Implement an llms.txt file (the robots.txt for AI crawlers) and consistently use JSON-LD – specifically the Product, FAQPage, and Article schemas. These are the signals that AI agents prioritize.

API-first strategy: Ensure that inventories and prices can be retrieved in milliseconds via interfaces.
Outdated data was the main reason for OpenAI's checkout failure – and the same mistake will be costly for retailers once agents actively place orders.

Semantic enrichment with the Query Fan-Out principle: Answer the questions an AI asks when comparing products on behalf of a customer: For which use cases is the product optimal? What alternatives are there? What are common purchase barriers? This depth separates cited content from ignored content.

Build a GEO strategy and brand authority: Ensure that your shop is perceived as an expert entity in relevant categories – in ChatGPT, Perplexity, and Google AI Mode. More on this in our GEO audit →

Secure DACH compliance early: PAngV and GDPR apply to AI-mediated purchases as well. Price reductions must disclose the lowest price of the last 30 days as a reference – and this must be machine-readable. Clarify this early with your legal advisor.

Conclusion: Become a leader of the new era

Agentic Commerce is no longer a science-fiction scenario – it is today's technological reality, still in development, but unstoppable. What OpenAI buried with Instant Checkout is one specific business model: the chatbot as a transaction facilitator between retailer and customer. What lives on – and is accelerating – is the underlying logic: AI shopping agents take over discovery, filter options, and prepare purchase decisions. This already happens daily for millions of users. The question for retailers is no longer whether the agent era is coming, but whether they will be visible when the agent decides. The companies that are ahead in two years will not be the ones with the biggest budget. They will be the ones with the best data, the strongest GEO presence, and the clearest understanding of how Artificial Intelligence in e-commerce can be used as a lever rather than a threat.

Frequently Asked Questions about Agentic Commerce

What is the difference between Agentic Commerce and traditional e-commerce?
Traditional e-commerce follows the search-and-click principle: the user actively searches, compares manually, and buys themselves. Agentic Commerce follows the ask-and-done principle: an AI shopping agent takes over product search, price comparison, availability check, and – if authorized – the purchase completion fully autonomously.

What is Agentic Shopping?
Agentic Shopping is the practical manifestation of Agentic Commerce: the user formulates a concrete goal – such as "Order printer cartridge XYZ at the best price by tomorrow" – and an AI shopping agent carries out all steps independently: search, comparison, purchase.

Why did OpenAI discontinue Instant Checkout?
OpenAI faced three technical hurdles: lack of real-time inventory synchronization across millions of retailers, no infrastructure for tax calculation, and no fraud prevention for agent-based transactions. OpenAI is now pivoting to product discovery – the checkout remains with the retailer.

What is the difference between SEO and GEO?
SEO (Search Engine Optimization) optimizes content for the Google search algorithm and for human users – the goal is the click. GEO (Generative Engine Optimization) optimizes for AI systems and Large Language Models that extract content and output it as a direct answer – without the user clicking through to a website. The two disciplines complement and build on each other.

Is my shop legally safe for AI purchases in Germany?
In the DACH region, you must pay particular attention to the GDPR and the PAngV (Price Indication Regulation). Price reductions must always disclose the lowest price of the last 30 days as a reference – machine-readably, so AI agents can process it too. Clarify this early with your legal advisor before you register for Agentic Commerce protocols.

When is Agentic Commerce coming to Germany?
ACP and the new ChatGPT shopping hub are currently US-first.
However, Google Merchant Center and Google AI Mode are already active in DACH – and AI Overviews already appear in 33 percent of all German search queries. Experts predict that AI agents could reach a market share of 20–30 percent in European e-commerce within two to three years. The preparation starts now.

Is your shop ready for AI shopping agents? We analyze your GEO visibility and your product feed, and show you where you are currently invisible to AI agents – and how you can change that. Request GEO analysis now →

Sources & further links:

CNBC, March 2026: "OpenAI revamps shopping experience in ChatGPT after struggling with Instant Checkout" – cnbc.com
Forrester Research: ConsumerVoices Market Research Survey, March 2026
Gartner: Bob Hetu, analyst, to CNBC, March 2026
The Information, March 2026: First report on the Instant Checkout withdrawal
OpenAI Blog, March 2026: Official statement on Instant Checkout and the new shopping experience
Google: Universal Commerce Protocol – announcement, January 2026
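To make the feed work from the checklist above more concrete, here is a hedged sketch of a product record carrying the UCP attributes named in the article (product_faq, product_use_cases, native_commerce). The surrounding structure is purely illustrative – it is not an official UCP or Merchant Center schema, and the product values are invented.

```python
import json

# Illustrative product record with the UCP attributes discussed above.
# Everything except the three attribute names is an assumption.
product_item = {
    "id": "DE-12345",
    "title": "Organic cotton T-shirt, navy",
    "price": {"value": "29.90", "currency": "EUR"},
    "availability": "in_stock",
    "product_faq": [
        {"question": "Is the shirt machine-washable?",
         "answer": "Yes, at up to 40 degrees Celsius."},
    ],
    "product_use_cases": [
        "Everyday basic for casual office wear",
        "Layering piece for transitional seasons",
    ],
    "native_commerce": True,  # signals readiness for autonomous checkout
}

print(json.dumps(product_item, ensure_ascii=False, indent=2))
```

The takeaway is structural: FAQs and use cases live inside the feed itself, so an agent can answer buyer questions without ever crawling the product page.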
Budget Killers in Your Account: Quickly Identify Unprofitable Campaigns and Optimize Google Ads
Mar 23, 2026

Karina
Nikolova
Category:
Search Engine Advertising

One of the main differences between SEA and SEO is time. While SEO measures need time to show growth and performance improvements, paid campaigns require quick action, as any delay costs money. Even if your campaigns appear to be set up correctly at first glance, you can't rely on hope and a good gut feeling if they aren't delivering profitable results. In the following article, I will demonstrate three signs that help you recognize unprofitable campaigns at first glance, what could be behind them, and how you should optimize your Google Ads campaigns in these cases. Before we get started, however, there are three points that can quickly explain poor performance. If your campaigns still perform poorly despite these factors, you should choose a different approach to improve the figures and reduce Google Ads CPCs.

Your tracking isn't working

It's a commonly underestimated problem: unexpected changes on your website, such as the creation of new landing pages or migration to other data platforms, can break your tracking. This can result in your campaigns showing 0 conversions. Ideally, the Google Ads managers are informed in advance about such planned changes, but in reality that's not always the case. An example: once, a client of mine removed a CTA button that we had measured as a soft conversion goal. My campaigns began to struggle significantly, and I had to find a solution quickly to reduce Google Ads costs. In the end, we couldn't see any conversions because there was literally no conversion action left on the website that could trigger conversions in Google Ads.

Tip: Regularly check whether your tracking is functioning correctly. Without working tracking, you cannot optimize your Google Ads. Conversions may still be generated, but they won't appear in Google Ads, only in the backend. Once the tracking problems are resolved, your campaign might perform well again.
Your campaign is still in the learning phase

Paid campaigns need patience, even though we all want to see good results as quickly as possible. That would prove our expertise and help us further optimize and scale the Google Ads campaigns. However, new campaigns cannot always work wonders, as the algorithm needs time to learn and improve performance. The official learning phase usually lasts up to four weeks. Depending on the business model, this process can also be shorter, because the quicker the campaign generates conversions, the faster the algorithm learns. However, this development is not always guaranteed. For instance, the average customer journey in the B2B sector generally takes more time and often includes several touchpoints before the desired result is achieved.

Tip: Be patient during the learning phase.

Your main goal is not clear

Unrealistic expectations usually lead to disappointment - not only in life but also in Google Ads. If marketing goals are vague, clear results will not follow either. If the goals are clear but you don't know which campaign types suit them, the figures will also disappoint. For example, if you work with display or video ads, you should not automatically expect many high-quality leads. Not because your setup is wrong, but because these campaign types pursue different goals: they are meant to increase awareness of your product and cover the early phase of the customer journey. The ad formats are tailored to this goal - think of skippable ads on YouTube. They are there to promote your brand and convey a message. It is not realistic to expect good leads from them, as they are likely to be skipped, with the customer taking no further action. If your shopping campaigns don't deliver results for weeks, however, that is at least alarming.

Tip: Define clear objectives for each phase of the funnel and choose the appropriate campaign types.
Only then can you effectively optimize your Google Ads campaigns.

There is a budget killer in the house

But let's get back to the three clear signs that a budget killer is present in your account:

1. Campaigns with traffic but no conversions
2. Rising CPAs
3. Decreasing ROAS

If your goal is conversions and you see none, or increasingly fewer, there's a problem - especially if your tracking is functioning and the learning phase is complete. If the campaign still does not deliver the desired conversions, this impacts not only your KPIs but also the performance of your automated bidding strategies. For instance, if you optimize for tCPA or tROAS, declining conversions will lead to a higher CPA, a lower ROAS, and overall restrictions on bidding strategies. Here is a list of factors that could explain the decline in conversions you are observing:

Landing page - Any change that worsens the user experience can negatively influence the conversion rate as well as the bounce rate.
Competition - Especially in e-commerce, competitors' lower prices can affect the number of conversions as well as the conversion rate.
Seasonality - If your business experiences significant declines during certain periods, you should adjust your marketing strategy accordingly.
Irrelevant traffic - Ensure that your ads don't appear for irrelevant search queries, to avoid paying for poor traffic. This often helps to lower Google Ads CPCs.
Faulty targeting - A reasonable campaign setup is vital in Google Ads. However, even with an optimal setup, certain target groups or keywords may perform worse than expected. For this reason, you should quickly optimize the targeting of your Google Ads campaigns if the desired results fail to appear.

Google Ads campaigns are not static. What works well today can perform poorly tomorrow.
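The three warning signs above can be checked mechanically against exported campaign stats. A minimal sketch with made-up numbers, where each campaign's current period is compared against its previous-period CPA and ROAS:

```python
# Made-up campaign stats: clicks, conversions, cost, revenue for the current
# period, plus CPA/ROAS from the previous period for comparison.
campaigns = [
    {"name": "Brand",   "clicks": 900,  "conv": 40, "cost": 500.0,  "revenue": 4000.0,
     "prev_cpa": 13.0, "prev_roas": 7.5},
    {"name": "Generic", "clicks": 1200, "conv": 0,  "cost": 1800.0, "revenue": 0.0,
     "prev_cpa": 95.0, "prev_roas": 1.2},
]

def budget_killer_signs(c):
    """Return which of the three budget-killer signs a campaign shows."""
    signs = []
    if c["clicks"] > 100 and c["conv"] == 0:
        signs.append("traffic but no conversions")
    cpa = c["cost"] / c["conv"] if c["conv"] else float("inf")
    if cpa > c["prev_cpa"]:
        signs.append("rising CPA")
    roas = c["revenue"] / c["cost"] if c["cost"] else 0.0
    if roas < c["prev_roas"]:
        signs.append("decreasing ROAS")
    return signs

for c in campaigns:
    print(c["name"], budget_killer_signs(c))
```

With these numbers, "Brand" shows no signs, while "Generic" triggers all three. The 100-click threshold is an arbitrary cutoff to avoid flagging campaigns with too little traffic to judge.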
As a marketing manager, you should thoroughly understand the business model and goals, select the appropriate campaign types, set KPIs, and set realistic expectations. The rest lies in flexible and smart Google Ads optimization. Additionally, your task extends beyond Google Ads as overall performance is influenced by many other factors described above. For example, dramatic political or economic developments can have the same negative impact as a poorly optimized campaign. Your Google Ads expertise should go hand in hand with thorough market analysis so that you can see the bigger picture and take the right actions. If you need assistance with this or if you want to scale your existing campaigns, our SEA team is happy to advise you. Contact us now!
Identify and Properly Analyze AI Traffic in Google Analytics
Mar 9, 2026

Nadine
Wolff
Category:
SEO

Since Large Language Models (LLMs for short) have become part of everyday life and users increasingly turn to AI tools like ChatGPT, Gemini, Claude, or Perplexity, a completely new traffic source has emerged. For website owners and marketing managers, the question is increasingly how many users actually reach their website via links and recommendations from these LLMs, and how large the share of this AI-generated traffic is in overall visitor volume. This traffic - let's call it "AI Traffic" - is not automatically shown in Google Analytics. In this article, I'll show you how to find, measure, and evaluate AI Traffic in GA4. At the same time, you'll learn what conclusions you can draw from it for your planning and why AI visibility will soon be just as relevant as classic search engine rankings.

What exactly is AI Traffic and how is it composed?

The term AI Traffic refers to all website visits that originate from AI systems and generative search engines. Here are some examples of where the traffic can come from:

Traffic from ChatGPT / GPT Search
Traffic from Perplexity
Traffic from AI-integrated browsers (e.g., Microsoft Edge with the integrated Copilot)
Copied links that users click from AI responses

AI Traffic can be generated actively by users when they click links in an AI response. In addition, there is passive traffic when AI systems crawl pages to process content for their models.

Recognizing AI Traffic in GA4: The most important methods

1. Recognize referrers (e.g., ChatGPT traffic)

When a user clicks a link from an AI response, the browser automatically sends a so-called referrer. This information indicates which page the user is coming from. In GA4, this appears in the traffic acquisition report as "Referral," for example with the source perplexity or claude.

Figure 1: AI traffic via a referrer

2. UTM tracking

For some time now, ChatGPT has automatically appended "?utm_source=chatgpt.com" to links it outputs in responses.
This means that this AI Traffic appears in Google Analytics not as a referral but as its own source with UTM tagging - and is therefore easier and cleaner to identify than plain referral traffic. Perplexity and other AI systems do not necessarily do this; their traffic is often only identifiable via the referrer.

Making AI Traffic visible in a GA4 exploration

Exploratory data analysis in GA4 offers the most flexible way to evaluate AI Traffic in a targeted manner. Unlike in standard reports, you can freely combine your own dimensions, filters, and segments here. To do this, create a new blank exploration and add a dimension and, if desired, one or more metrics:

Dimension --> Session source/medium
Metric --> Sessions

Figure 2: Exploratory data analysis

To see only traffic from AI platforms, now create a filter using a regular expression (regex). This filter ensures that only sessions are shown whose source is one of the AI platforms mentioned.

Figure 3: Example of a regex that filters the various AI systems

The result shows you - as in the example above - a detailed table by source and medium. One thing stands out: ChatGPT appears in two variants, once as "chatgpt.com / referral" and once, with UTM tagging, as "chatgpt.com / (not set)." This is because ChatGPT does not consistently append the UTM parameter to every link. It is therefore recommended to evaluate both entries together.

What you see in GA4 - and what it means

Once you have isolated AI Traffic in GA4, you essentially have three metrics available:

Size & development: How many sessions are generated via AI platforms? How does this develop over time? A growing value shows that your content is increasingly being recommended by LLMs as a source - a direct signal of your AI visibility.

Links: Which pages are being linked? Which of your subpages appear as landing pages? This metric shows you which content LLMs consider relevant enough to recommend.
These are your strongest pieces of content in an AI context.

User behavior: Time on site, bounce rate, and engagement rate of AI Traffic, compared with other channels, show whether the linked content matches users' expectations. High bounce rates can mean that the linked page does not deliver what the AI response promised.

What you can infer from AI Traffic in GA4

The landing pages receiving AI Traffic are direct feedback on which content LLMs consider worth citing. Look at what these pages have in common: Are they explanatory how-to articles? Detailed guides? Definitions? These patterns show you which content formats LLMs prefer - and you can use that specifically for new content.

Identify content gaps

Get an overview of which topics your AI Traffic is coming from and compare them with your overall content offering. Are there topic areas where you get traffic but have only a few or thin pieces of content? These are your content gaps - areas where LLMs already see you as a relevant source, but you are not yet fully realizing the potential.

Optimize content specifically for LLMs (GEO)

Generative Engine Optimization, or GEO for short, is the counterpart to classic SEO - but for AI systems. Specifically, the goal is to structure content so that LLMs can easily process and cite it. This includes clear, concise answers to specific questions, well-structured sections with clear headings, and trustworthy, source-based language. Pages that already receive AI Traffic are your best starting point - they are clearly already working, and targeted optimization can further increase their visibility in LLM responses.

Conclusion: AI Traffic is becoming a strategic success factor

Recognizing AI Traffic in GA4 is possible, but only with the right methods. Anyone who understands AI visibility and tracks it cleanly gains valuable insights into the relevance and future viability of their content.
For companies, this means a new responsibility in content creation and technical optimization. If you need support with tracking, SEO/GEO, or AI content strategy, feel free to get in touch with us. Our team will help you make AI visibility measurable and align your measures based on data. Contact us now!

FAQ

What is the difference between AI Traffic and bot traffic?
Bot traffic comes from classic crawlers, while AI Traffic results from AI systems and real users in AI interfaces.

Is AI Traffic automatically marked in GA4?
Not completely. Some systems are recognized, but much of it still has to be filtered out via segments or referrers.

Which AI platforms should I track in GA4?
The most important sources today are ChatGPT, Perplexity, Claude, Gemini, and Microsoft Copilot. ChatGPT is usually the largest source, and because it automatically sets UTM parameters it is also the easiest to identify in GA4.

Is it worth analyzing AI Traffic if the volume is still low?
Clear answer: yes! Anyone who starts measuring and understanding AI Traffic now builds an advantage before this channel becomes the industry standard. As with SEO in the early 2000s, those who get in early benefit in the long run.
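The regex filter described in the exploration section can be prototyped before pasting it into GA4. The exact pattern below is an assumption - extend the source list to the platforms you actually see in your reports:

```python
import re

# Candidate pattern for the "Session source" filter in a GA4 exploration.
# The alternatives cover the AI platforms discussed above.
AI_SOURCES = re.compile(
    r"(chatgpt\.com|openai\.com|perplexity|claude\.ai|gemini|copilot)",
    re.IGNORECASE,
)

# Sample source/medium values as they would appear in the exploration table.
sessions = [
    "chatgpt.com / referral",
    "chatgpt.com / (not set)",   # UTM-tagged variant, evaluated together
    "perplexity.ai / referral",
    "google / organic",
]

ai_traffic = [s for s in sessions if AI_SOURCES.search(s)]
print(ai_traffic)
```

Note that both ChatGPT variants match, so the filter automatically combines the referral and UTM-tagged entries, exactly as recommended above.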
Optimizing content specifically for prompts using the Query Fan-Out principle
Feb 13, 2026

Julien
Moritz
Category:
SEO

Large Language Models (LLMs) like ChatGPT, Claude, or Gemini are fundamentally changing how content is found, evaluated, and used. Visibility is no longer achieved solely through traditional search queries but increasingly through prompts that users enter into AI systems. A frequently mentioned principle for optimizing your content in this context is the so-called Query Fan-Out principle. But what does it mean for your content in concrete terms? In this article, you'll learn how ChatGPT & Co. decompose queries in the background and how you can structure your content so that it is relevant, comprehensible, and quotable for LLMs.

Key points at a glance

LLMs generate multiple search queries simultaneously from a single prompt (Query Fan-Out).
These queries often run in parallel in both German and English.
Content is evaluated based on topics, entities, terms, and synonyms.
In just a few steps, you can analyze yourself which queries ChatGPT uses. We show you how below.
From this, concrete requirements for your content structure can be derived.

What is Query Fan-Out?

Query Fan-Out describes the process by which an LLM generates multiple sub-queries from a single prompt. A prompt is thus unfolded into multiple queries. This multitude of queries is called a fan-out because the query fans out into many individual queries. In the background, the system sends the various search queries simultaneously to an index (e.g., Bing or Google). Only from the synthesis of selected results does the AI compile the final answer. In the step-by-step guide below, we examine how you can easily investigate this yourself for any prompt.

Why is Query Fan-Out so important?

Your content is meant to be found. But Large Language Models are used more and more today. This changes the requirements for your content: it should continue to appear in Google search results, but also be used by as many LLMs as possible for answer generation.
The better your content matches the generated queries, the more likely it is to be used by LLMs as a source.

Step-by-step guide: Which queries does ChatGPT create?

With a sample prompt in ChatGPT, you can see clearly what these queries look like. You can easily recreate this for your own prompts and optimize your content accordingly.

Step 1: Open the developer tools
Open ChatGPT in the browser.
Enter a prompt and submit it.
Right-click somewhere in the interface.
Select "Inspect".

Step 2: Filter the Network tab and search for the chat ID
Switch to the "Network" tab.
Filter by Fetch/XHR.
Copy the chat ID from the last part of the URL.
Paste it into the search field.
Reload the page.

Step 3: Select the network request
Click on the network request with the chat ID in its name.
Switch to the "Response" tab.

Step 4: Find the queries
Search for the term "queries". Now you see the specific search queries that ChatGPT uses for its web search - mostly in German and English.

Step 5: Evaluate the queries
The following prompt was entered: "I want to mount my TV on the wall. What is the recommended seating distance for a 65-inch OLED TV? I'm looking for a high-quality and safe full-motion wall mount. Compare current models and suggest the best ones!"

ChatGPT uses two sub-queries in the web search to find suitable content:

1. DE: "pivotable wall mount 65 inch TV recommendations wall mount TV 65-inch pivotable"
1. EN: "best full motion TV wall mount for 65 inch TVs review high quality"
2. DE: "recommended seating distance 65 inch TV distance OLED TV seating distance"
2. EN: "what is recommended viewing distance for 65 inch TV"

From these queries, ChatGPT searches for appropriate sources and then generates the final answer, with sources indicated. Now you should look carefully at the queries and also at the sources used. What types of content are cited? The example clearly shows that there is an information cluster and a comparison cluster, and that different sources are used for each cluster.
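The extraction in steps 1-4 can also be done on a saved copy of the network response. Since the response structure is undocumented and may change at any time, the sketch below simply walks arbitrary JSON and collects everything stored under a "queries" key; the sample payload is a stand-in, not the real format.

```python
import json

def find_queries(node, found=None):
    """Recursively collect every string list stored under a 'queries' key,
    whatever the surrounding (undocumented) response structure looks like."""
    if found is None:
        found = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key == "queries" and isinstance(value, list):
                found.extend(q for q in value if isinstance(q, str))
            else:
                find_queries(value, found)
    elif isinstance(node, list):
        for item in node:
            find_queries(item, found)
    return found

# Stand-in for the copied network response (the real structure differs).
response = {"messages": [{"metadata": {"queries": [
    "recommended viewing distance 65 inch TV",
    "best full motion TV wall mount 65 inch review",
]}}]}

print(find_queries(response))
```

Because the walker is structure-agnostic, it keeps working even when the nesting around the "queries" array shifts between ChatGPT versions.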
To be found optimally for this prompt, you need an informative article on the topic "Recommended viewing distance to the TV". From ChatGPT's queries, it can be derived that the subtopics TV size in inches and display types (e.g., OLED) should be addressed. Additionally, the synonym "TV viewing distance" should appear in the content, preferably in an H2. The product selection comes from other articles. Your products should therefore appear in as many comparison articles (on external websites) on the topic "Best TV wall mount" as possible, so they can be presented here. Additionally, ChatGPT accesses manufacturer websites. With your own content on product and category pages, you can influence the answers of LLMs. Consider clearly what makes your product or service unique and how you stand out from competitors, because exactly these advantages can bring users from the AI chat to your own website. It can also be worthwhile to publish your own comparison articles. Naturally, you should feature your own brand prominently in these, but also mention competitors and their advantages. LLMs recognize that the information density on the English-speaking web is generally higher. Translating your own content can therefore be a great advantage and ensure greater visibility in ChatGPT and other systems.

Strategies for optimizing for the Query Fan-Out principle

What does the Query Fan-Out principle mean for your own content? You need an SEO strategy that still works in the age of generative AI. For this, we have developed five tips that you can implement directly.

1. Comprehensive topic clusters instead of keyword focus

Google's Query Fan-Out behavior shows the intent to capture topics in their entirety. LLMs divide a prompt into multiple thematic clusters with varying intent, such as information, comparison, or product queries. Informative content should therefore be built comprehensively.
Content should not only answer "What" questions but also "How", "Why", and "What are the alternatives?" Use synonyms and related entities in a targeted way: if you write about TV wall mounts, terms like "VESA", "full-motion", and "OLED television" should be included.

2. Direct Answers
Write precise definitions and direct answers to user questions at the beginning of your paragraphs. An AI looking for a quick answer to a sub-query is more likely to cite text that provides a clear answer: "The ideal viewing distance for a 65-inch OLED TV is about 2.50 to 2.80 meters." Avoid filler sentences inserted just to include keywords. Further detailed information covering secondary keywords can follow afterward.

3. Structured Data
LLMs work resource-efficiently and love structure. When an AI conducts a price comparison or a technical analysis, it preferentially accesses data marked up with Schema.org. Use structured data in JSON-LD format to make products, FAQs, and reviews machine-readable.

4. Internationally Visible Content
Large language models often generate English-language queries automatically, even when the prompt is written in German. Building internationally visible content is therefore increasingly important, even if your target audience is German-speaking. You should provide your core content in English as well.

5. Building External Visibility
Transactional queries like "Best price-performance TV wall mount 2026" are answered using comparison content and user reviews. To be visible with your brand in LLMs, you need to build recognition. Content partnerships with magazines or collaborations with influencers who publish independent reports and product comparisons are a strong lever. It is not just about classic backlinks that convey authority, but also about mentions of your brand in a relevant context on as many platforms as possible: articles from magazines, competitors, and online retailers, as well as user-generated content on YouTube, Reddit, and elsewhere.
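Tip 3 recommends JSON-LD structured data. As a minimal sketch, this is how schema.org Product markup for the wall-mount example could be generated; all product values, names, and numbers below are invented for illustration.

```python
import json

# Hypothetical product data; field names follow schema.org Product / Offer.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": 'Full-Motion TV Wall Mount 65"',
    "description": "Pivotable wall mount for 40-75 inch TVs, VESA 600x400.",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "132",
    },
}

# Embed the output in the page inside a
# <script type="application/ld+json"> ... </script> block.
print(json.dumps(product, indent=2, ensure_ascii=False))
```

Validate the generated markup with Google's Rich Results Test before deploying it, since incomplete Offer or AggregateRating fields are a common reason markup is ignored.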
Conclusion: SEO & GEO United
Query Fan-Out reveals how LLMs find and evaluate content. By structuring your content to answer multiple questions at once, covering a topic completely, and considering relevant entities and synonyms, you optimize not only for traditional search engines but deliberately for AI systems. A new form of visibility is emerging here. Optimizing for the Query Fan-Out principle is no longer a nice-to-have but the new foundation of digital visibility. By understanding how LLMs deconstruct queries, you can create content that is not only found but also cited as a trustworthy source. If you need support or want to optimize your content specifically for LLMs, our SEO/GEO team will be happy to advise you. Contact us now!
ChatGPT for Ad Copy: Turning Strategic Decisions into Measurable Performance
Jan 30, 2026

Yasser
Teilab
Category:
Search Engine Advertising

Good ads rarely emerge from a sudden spark of inspiration or pure creative chaos. In the world of performance marketing, they are the result of a rigorous process: clear decisions, sound hypotheses, and the relentless willingness to test them in the market against the reality of data. At this point, ChatGPT for ad copy becomes either a highly effective precision tool or a mere text production machine that just creates digital noise. AI does not determine the success of a campaign; it merely exposes how structured your marketing thinking really is. In this guide, you'll learn how to transform ChatGPT from a "writing aid" into a strategic performance tool that takes your Google Ads and Meta Ads to a new level. This strategic approach is exactly what we implement daily at internetwarriors in Google Ads and Meta Ads – data-driven, test-based, and scalable. Book an appointment with us now!

The Paradox of AI Text Production: Why More Content Doesn't Automatically Mean More Success

Ad copy has always been a testing problem. Marketers formulate assumptions, launch them, and let the numbers decide. The real limit was never in tracking or analysis, but in operational capacity. Every new ad, every new "angle" took time in conception, coordination, and creation. ChatGPT has shattered this limit. A new opening or an alternative tonality can be developed in seconds today. But here's the trap: those who misuse ChatGPT only scale mediocrity.

The shift in everyday work:
• Previously: the bottleneck was writing (copywriting).
• Today: the bottleneck is thinking (strategy & psychology).

ChatGPT doesn't think strategically. It doesn't decide which message is relevant in the market. If your ads didn't work before, ChatGPT won't solve that problem – it will only accelerate failure by producing more bad ads in less time.

Preparation: Ad Copy Starts Not in the Prompt but in the Focus

Much of what is perceived as "generic" AI text is due not to the model but to a weak briefing.
Before you type the first prompt into the chat window, one central question must be answered: why should the audience click right now?

The Psychology of the Click

People don't click on ads because a product is "innovative" or "market-leading." They click because they expect a transformation. ChatGPT is excellent at translating a well-defined idea into variations, but it is unsuitable for finding that idea itself.

What you need to define before using ChatGPT:
• The specific pain point: What exact problem keeps your customer awake at night? (Not "they need software," but "they're afraid of data loss.")
• The functional benefit: What improves immediately? (Time savings, risk reduction, status gain.)
• Objection handling: What thought keeps the customer from clicking? ("Too expensive," "too complicated," "no time to switch.")

Thinking in "Angles": The Framework for High-Converting Ads

If you use ChatGPT for ad copy, stop asking for "texts" and start thinking in angles. An angle is a conscious decision for a psychological perspective.

Angle type – focus – example (project management tool):
• Efficiency – time savings & focus – "Gain back 5 hours per week."
• Safety – error avoidance & control – "Never miss a deadline again."
• Simplicity – low barrier & usability – "Set up in 2 minutes. No training required."
• Social proof – trust & benchmarking – "Why 500+ agencies have switched."

The rule: one angle always corresponds to exactly one hypothesis. Only when the angle is set do we let ChatGPT formulate the variations. Defining, testing, and systematically scaling angles is not a creative but a strategic problem. If you want to know how we translate such hypotheses into high-performing campaigns, find out more about our approach now!

ChatGPT for Google Ads: Mastering Responsive Search Ads (RSA)

In Google Ads, AI plays to its strengths especially well with Responsive Search Ads. This ad format thrives on the combination of different elements. The most common mistake?
Creating 15 headlines that all say almost the same thing.

The Building Block Principle

Effective RSA copy is created when each headline serves a clear function. We use ChatGPT to serve these functions deliberately:
• Function A: problem description (e.g., "Tedious Excel lists?")
• Function B: benefit promise (e.g., "Automatic reporting at the push of a button.")
• Function C: trust signal (e.g., "2024 test winner.")
• Function D: call to action (e.g., "Request demo now.")

Strategic prompt tip for Google Ads: "Create a total of 10 headlines for a Google Search ad for product [X]. Important: create 3 headlines that address a problem, 3 headlines that mention a benefit, and 4 headlines with a strong CTA. Each headline must be at most 30 characters long. Avoid repetition."

Meta Ads: The Battle for the "Scroll Stop"

In the Meta environment (Facebook & Instagram), the attention span is minimal. The first sentence – the hook – decides between success and failure.

ChatGPT as Hook Generator

Instead of generating entire ads, it is more effective to use ChatGPT solely to develop openings. A strong hook must pull users out of their passive scrolling trance.

Three hook formats to test with ChatGPT:
• The provocative question: "Did your team really know what was top priority this morning?"
• The statistical statement: "78% of all projects fail due to poor communication – here's how to prevent it."
• The negative framing: "Stop wasting time in meetings that could have been an email."

Important: even if ChatGPT provides the text, manual verification against advertising policies (especially for sensitive topics like finance or health) is indispensable.

Practical Guide: How to Brief ChatGPT Like a Pro

To get results that don't sound like a robot, you need a structured briefing framework. At internetwarriors, we often use the following scheme:

Step 1: Role Assignment
Always start by giving the AI an identity.
"You are an experienced performance marketer and conversion copywriter. Your goal is to write copy that not only informs but also triggers an action (click/purchase)."

Step 2: Context Input
Feed the AI with hard facts:
• Target audience: a specific persona (e.g., "CEO of a small agency, 30-50 years old, stressed").
• Offer: What is the irresistible offer?
• Objection: What is the customer's biggest concern?
• Tone: e.g., "direct, professional, without marketing clichés."

Step 3: Iteration
Never settle for the first result. Use commands like:
• "Make it shorter and more concise."
• "Remove all adjectives like 'revolutionary' or 'unique'."
• "Reword angle 2 for a very price-sensitive audience."

The "Warriors Check": The 5 Most Common Mistakes in AI Ads

To keep your performance campaigns from sinking into mediocrity, avoid these mistakes:
1. Too much trust in the facts: ChatGPT sometimes hallucinates. Always verify USPs and data manually.
2. Missing brand voice: if the AI sounds too much like a salesperson, you lose your target audience's trust. Adjust the tone.
3. Ignoring platform logic: copy that works on LinkedIn will fail miserably on Instagram. Adapt the formats.
4. No A/B testing: many marketers use AI to find one perfect ad. The goal, however, should be to find five radically different approaches and test them against each other.
5. Marketing buzzword bingo: words like "holistic," "synergistic," or "innovative" are click killers. Instruct the AI to remove them.

Outlook: The Future of Ad Creation

We are moving towards an era in which AI will adapt not only text but also images and videos in real time for individual users. Yet even in this world, one constant remains: strategy beats the tool. Those who learn today to use ChatGPT as a partner for hypothesis building and angle development will have an unbeatable advantage. It's not about writing faster – it's about learning faster what works in the market.
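The Google Ads prompt tip above caps headlines at 30 characters, which matches Google's RSA limit (descriptions allow 90). Before pasting ChatGPT's output into the Ads interface, a quick length check saves manual counting. A small sketch with invented sample assets:

```python
# Google Ads RSA limits: headlines max 30 characters, descriptions max 90.
MAX_HEADLINE = 30
MAX_DESCRIPTION = 90

def check_assets(headlines, descriptions):
    """Return (text, length) pairs for all assets exceeding the RSA limits."""
    too_long = []
    too_long += [(h, len(h)) for h in headlines if len(h) > MAX_HEADLINE]
    too_long += [(d, len(d)) for d in descriptions if len(d) > MAX_DESCRIPTION]
    return too_long

# Hypothetical ChatGPT output to validate:
headlines = [
    "Tedious Excel lists?",
    "Automatic reporting at the push of a button.",  # over the 30-char limit
    "Request demo now.",
]
descriptions = ["Set up in 2 minutes. No training required."]

for text, length in check_assets(headlines, descriptions):
    print(f"{length:>3} chars: {text}")
```

Anything the script flags needs to be shortened or sent back to ChatGPT with a "make it shorter" iteration command before upload.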
Conclusion: ChatGPT Is Your Lever, Not Your Replacement

If ChatGPT has so far mainly served to "quickly create a text" in your setup, much of its potential remains untapped. The decisive lever lies in systematically combining psychological know-how, clean structure, and the speed of AI. This is exactly where we at internetwarriors come in. As specialists in Google Ads and Meta Ads, we help companies:
• build ad copy processes strategically,
• integrate AI meaningfully and data-driven into campaigns,
• develop scalable setups based not on chance but on validated hypotheses.

Do you want to use ChatGPT not just as a typewriter but as a real performance tool? We support you in sharpening your messages so that they are not only seen but also convert. Contact us for a non-binding analysis of your current campaigns! This article was created with AI assistance – but curated with the strategic mind of a warrior.
2026 and the Age of Agentic Search – When Customers Are No Longer Human
Jan 14, 2026

Axel
Zawierucha
Category:
Growth Marketing

Here you will find all parts of our blog series:
Part 1 – Why "Zero-Sum" is a misconception and the search is just beginning | find it here
Part 2 – The "December 2025 Core Update" and how to regain visibility | find it here
Part 3 – Advertising in the Age of Conversation – Why keywords are no longer enough | find it here
—————
Blog Series: The Transformation of Search 2026 (Part 4/4)

Welcome to the future. Or better: welcome to the present of 2026. In the previous parts, we analyzed the traffic crash and explored new advertising tools. To conclude this series, we venture a look at what is emerging: the "Agentic Web". The biggest change ahead is not how people search, but who searches. We are experiencing the transition from information gathering to task completion.

"Preferred Sources": Democratization of the Algorithm

Let's start with a technology that is already here and will change SEO forever: "Preferred Sources". In late 2025, Google rolled this feature out globally. Users can now actively mark (with a star) news sources and publishers that they prefer. Why is this revolutionary? Until now, SEO was a technical battle against an anonymous algorithm. Now brand loyalty becomes a direct ranking factor. If users mark your page as a "Preferred Source", your content receives a permanent boost in their feed – completely independent of what the next core update dictates.

This means:
• Community > keywords: a small, loyal fan base is more valuable than broad, volatile traffic.
• Trust as a metric: you must actively motivate your users to choose your brand as a preferred source. This is the new newsletter signup.

"Live with Search": Seeing the World Through the Camera

SEO has been text-based so far. With "Live with Search", it becomes multimodal. Users can now interact with Google in real time via camera and voice. A user films a shelf at the hardware store and asks, "Which of these anchors will hold in drywall?"
Thanks to the new Gemini Native Audio model, Google responds smoothly, like a human advisor in your ear. The implication for brands: your products must be visually identifiable. Packaging design becomes SEO. And your website must answer questions posed while viewing the product, not just while searching for it.

"Agentic Search": From Searching to Doing

The term of the year 2026 is "Agentic Search". An AI agent is more than a chatbot. A chatbot gives information; an agent acts.
Search 2024: "Show me flights to London."
Agentic Search 2026: "Book me the cheapest flight to London on Friday, take my preferred aisle seat, and add it to my calendar."

Experts predict that the market for AI agents will explode to over 50 billion dollars by 2030. For us at internetwarriors.de, this means a radical shift towards "Search Everywhere Optimization". If your "visitor" is a bot, it doesn't need a nice design. It needs APIs, clear schema.org structures, and flawless logic. We no longer optimize websites only for human eyes, but for machine actors.

Gemini in Translate: The Global Competition

Finally, the last bastion falls: the language barrier. With the integration of Gemini into Google Translate, translations become context-sensitive and culturally nuanced. A US shop can suddenly serve the German market as if it were locally established, thanks to real-time translation. For German companies this means: the competition becomes global. But so do their opportunities.

Conclusion: The Year of Decision

The transformation of search in 2026 is not a threat to those who provide quality.
• Redundant information dies out (December update).
• Transactions and expertise prevail (Liz Reid's theory).
• Advertising becomes smart and context-based (AI Max).
• Brand loyalty beats the algorithm (Preferred Sources).

At internetwarriors, we are ready for this era. We help you not only to be found but to be chosen – by people and by agents. Let's discuss your strategy for 2026 together.
Schedule an appointment now.
Advertising in the Age of Conversation – Why Keywords Are No Longer Enough
Jan 13, 2026

Axel
Zawierucha
Category:
Growth Marketing

Here you will find all parts of our blog series:
Part 1 – Why "Zero-Sum" is a misconception and the search is just beginning | find it here
Part 2 – The "December 2025 Core Update" and how to regain visibility | find it here
Part 4 – 2026 and the Age of Agentic Search – When customers are no longer human | find it here
—————
Blog Series: The Transformation of Search 2026 (Part 3/4)

In the first two parts of this series, we analyzed the economic theory behind Google's transformation (the "Expansionary Moment") and the brutal reality of the December update for SEOs. But while SEOs are still licking their wounds, SEA (search engine advertising) managers need to reforge their weapons. The year 2026 marks the end of classic keyword dominance. With the introduction of "AI Max for Search" and the opening of "AI Mode" to advertising, Google has fundamentally changed the rules of monetization. Trying to bid on exact keywords ("Exact Match") against an AI today is like fighting drones with bows and arrows. In this article, we deconstruct the new advertising infrastructure and show you how to run ads in a world where users no longer search but hold conversations.

AI Max: The "Intent Engine" Replaces the Keyword

For a long time, "Performance Max" (PMax) was the panacea for Google's inventory. But there was a gap for pure search campaigns. This is now filled by "AI Max for Search", a tool that Google markets as a "One-Click Power-Up".

The Problem with Keywords

Imagine a user searching: "I need a car for 3 kids and a dog that runs on electricity and costs under $50,000." Previously, you had to bid on combinations like "electric SUV," "affordable family car," or "7-seater." You had to guess what users would type. AI Max turns this principle on its head: it analyzes not the words (strings) but the intent.

How AI Max Works

AI Max uses your website and its assets as its foundation.
When a user makes the complex request above, the AI understands the context ("family + space requirement + budget constraint"). It scans your landing page, finds your model "E-Family Van," dynamically generates a fitting headline (e.g., "The perfect e-van for your family of 5"), and displays the ad – even if you never booked the keyword "dog." The results speak for themselves: beta tests show a 27% increase in conversions at a similar CPA (cost per acquisition) compared to pure keyword campaigns.

Strategic advice: keywords become mere "signals." Your landing page and your creative assets (images, text) become the real targeting. If your landing page does not answer the question, AI Max cannot generate an ad.

The "AI Mode": Ads in the Conversation

The "AI Mode" is Google's answer to ChatGPT and Perplexity – a purely conversational interface capable of handling complex, multi-step inquiries. The crucial question for advertisers has long been: where is the space for advertising here? The answer: Sponsored Responses.

Integration Instead of Interruption

Unlike classic search, where ads are often perceived as disruptions, Google integrates ads seamlessly into the dialogue in AI Mode.
• Scenario: a user plans a trip to Tokyo and asks AI Mode about hotels near Shibuya Crossing with a pool.
• Advertising: instead of a banner, your hotel appears as part of the response, marked as "Sponsored," including an image and a direct booking link.

Since inquiries in AI Mode are two to three times longer than in classic search, the algorithm receives significantly more context signals. This enables targeting with unprecedented precision. A user who asks this specifically is deep in the funnel: the click rate may decrease, but the conversion rate rises.

The New Currency: Assets

To participate in AI Max and AI Mode, you need "raw material." The AI assembles the ad in real time. For you this means:
• Visual excellence: you need high-quality images and videos.
AI Max prioritizes visual elements to create "Rich Cards" in the chat.
• Structured data: your product feed (Merchant Center) must be flawless. The AI needs to know whether the shoe is "waterproof" to display it for the query "running shoes for rain."
• Broad Match + Smart Bidding: this is the technical prerequisite. "Exact Match" cuts you off from the new AI interfaces. You need to give the algorithm room (Broad Match) but control it through the target (Smart Bidding on ROAS/CPA).

Conclusion for Part 3

We are moving from a "search engine" to an "answer engine." Advertising must become the answer. Banner ads are dying out; helpful, context-sensitive product suggestions are taking over. Don't throw away your keyword lists, but treat them for what they are: relics of a time when we still communicated with machines in telegraphic language. Need help transitioning to AI Max? The SEA team at internetwarriors audits your account and prepares it for 2026.
The "December 2025 Core Update" and How to Regain Visibility
Jan 12, 2026

Axel
Zawierucha
Category:
Growth Marketing

Here you will find all parts of our blog series:
Part 1 – Why "Zero-Sum" is a misconception and the search is just beginning | find it here
Part 3 – Advertising in the Age of Conversation – Why keywords are no longer enough | find it here
Part 4 – 2026 and the Age of Agentic Search – When customers are no longer human | find it here
—————
Blog Series: The Transformation of Search 2026 (Part 2/4)

While Liz Reid emphasized the economic stability of Google search in interviews, dramas were unfolding in server rooms and marketing departments worldwide. The "December 2025 Core Update" will go down in history as one of the most volatile and toughest updates. It was not merely a correction; it was a system change. In this second part, we analyze the forensic data of the update, explain why redundancy is the new spam, and show you a way out of dependency with the new "Preferred Sources" feature.

Holiday Havoc: The Timing of Terror

The update began on December 11, 2025, at 9:25 AM PT and extended until January 1, 2026. For e-commerce and ad-funded publishers, this timing – in the middle of the busiest quarter – was "Holiday Havoc". The impacts were brutal and immediately measurable:
• Traffic collapse: hundreds of webmasters reported declines in daily visitor numbers of between 70% and 85%.
• Discover is dead (for many): Google Discover was particularly affected. One publisher documented a 98% drop in impressions within days, before the official announcement. Since Discover now accounts for up to two-thirds of traffic for many news sites, this was tantamount to a threat to their existence.
• Volatility index: the SISTRIX update radar recorded a value of 3.54 on the day of the announcement – a massive spike far beyond normal fluctuations.

The "Second Wave": Why It Hurt Twice

Our analyses at internetwarriors show an unusual pattern. After the initial crash on December 11, there was a deceptive calm, followed by a "second wave" of volatility around December 20.
We interpret this as a two-stage filtering process:
• Phase 1 (content): the algorithm scanned for static quality features and especially for redundancy.
• Phase 2 (user signals): in the second wave, the user data of the new AI Overviews was analyzed. Pages that ranked but didn't generate clicks, or that had high bounce rates compared to the AI response, were downgraded retroactively.

The New Ranking Poison: Redundancy

Why were so many established sites hit? The answer lies in the nature of AI Overviews. Previously, a page was valuable if it summarized information well. Today, the AI does that. The December update punished redundancy: if your page merely repeats facts already present in Google's Knowledge Graph (e.g., "How tall is Liz Reid?"), it is technically redundant. It offers no added value over the AI. Google has now firmly integrated its "Helpful Content" signals into the core algorithm. "Helpful" today means: does this page offer a perspective, experience, or data that the AI cannot hallucinate or aggregate?

The Glimmer of Hope: "Preferred Sources"

But Google didn't just take; Google also gave. In parallel with the update and the volatility, Google rolled out the "Preferred Sources" feature globally. This is perhaps the most important strategic innovation for 2026.
• What is it? Users can mark their preferred news sources in the search settings or directly in "Top Stories" (via a star).
• The effect: content from these sources gets a permanent ranking boost in the user's personal feed and appears in a separate section, "From your sources".

This fundamentally changes the SEO game. Until now, SEO was a battle for the algorithm. From now on, it is also a battle for brand loyalty. A small niche blog can outperform large publishers if it has a loyal community that actively marks it as a "Preferred Source". We see here a democratization of the algorithm: the users decide who ranks, not just the AI.
Your Survival Strategy for Q1 2026

Based on this data, we recommend the following immediate actions to our clients:
1. Redundancy audit: check your content. If you have an article that ChatGPT could write just as well in 10 seconds, delete or revise it. Add exclusive data, expert opinions, or videos.
2. The "star" campaign: launch campaigns to encourage users to mark you as a "Preferred Source". Explain to users how it's done. This is the new newsletter signup.
3. Diversification: do not rely solely on Google Discover. The 98% drop shows how volatile this channel is.

The December update was painful, but it has cleansed the market. Whoever is still standing now has substance. But how do you monetize this substance in a world where keywords are losing importance? In part 3 of our series, we dive deep into the new advertising world of AI Max and AI Mode, and show how ads are placed when no one is searching anymore.