Display advertising is an online advertising format in which the ad (commonly known as a banner, consisting of text, images, audio, and even video) is shown on a web landing page, usually at the top or side of it.
What began as a mere static image with some text has evolved into more interactive advertising formats. Banners can now include audio and video, and may even offer some user interaction. This is undoubtedly more attractive and increases the CTR (click-through rate, the percentage of clicks per number of impressions) compared with earlier versions. Today we start with Plista, which is similar to Zanox.
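The CTR mentioned above is simple to compute; here is a minimal sketch (the numbers are invented for illustration):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks per impressions, as a percentage."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

# 25 clicks on 1000 impressions -> 2.5 % CTR
print(ctr(25, 1000))  # 2.5
```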
This video is also nice:
With these tools we can see which content is shared the most across the social networks:
The importance of content marketing: for example, you see an "Anzeige" (ad) label, but when you click, the complete article turns out to be an ad:
With Mouseflow you can see what your visitors do on your website (heatmaps, live mouse tracking, link analysis, etc.):
--> It is predefined on each button and integrated into the HTML. --> Items the customer has viewed / pages they have visited are shown to them as advertising.
Display advertising with Google AdWords (steps)
Step 1: Create 1 campaign
campaign
display network only
remarketing can be selected
select a bidding strategy (clicks, impressions or conversions)
define locations / languages / devices, etc.
define start and end dates
ad rotation can be selected: the ad with the most clicks is shown - in practice this is useful
frequency capping: determines how many times a banner is shown to a user in remarketing
IP addresses can be excluded (e.g. within a company, so the banner is not shown to its own employees)
A/B tests are possible
Step 2: Create 2 ad groups
set a budget
define the interests of the target group (by selection)
a combined target-group definition is possible
Step 3: Create 3 ads
upload the ad file, design the ad and write the ad text
determine the banner format (possibly several formats, to be integrated flexibly)
Targeting / exclusion options
certain keywords can be excluded (the environment in which I do not want to be placed)
exclude interests
exclude certain places
specific target pages can be excluded (example: Bild.de)
parameters contained in the URL can be defined
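As a minimal sketch of the "parameters contained in the URL" point above, tracking parameters can be appended to a landing-page URL like this (the utm_* names are the common convention, not something prescribed here):

```python
from urllib.parse import urlencode

def tagged_url(base: str, params: dict) -> str:
    """Append URL-encoded tracking parameters to a base URL."""
    return base + "?" + urlencode(params)

print(tagged_url("https://www.example.com/landing",
                 {"utm_source": "display", "utm_campaign": "chile"}))
# https://www.example.com/landing?utm_source=display&utm_campaign=chile
```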
Targeting forms
keyword targeting
the ad is shown depending on the search behaviour of the customer
the user enters keywords into a search engine
the keywords for the ad are defined in advance in the campaign
contextual targeting
ads are shown in a thematically relevant environment
advertising is placed when certain keywords appear on the site
Danger: misplacement in a negative environment (example: car ads in a report on car accidents) is possible.
semantic targeting
the content and meaning of the text are analysed
thematically related ads are shown
misplacement in a negative environment (example: car ads in a report on car accidents) can be avoided.
behavior targeting
the ad is shown depending on the behaviour of the user. Example: a user often searches for cars or is often on car sites -> a car ad is displayed to him.
re-targeting
the user has already performed an action; afterwards, the ad is shown to him.
example: clicking on a product -> the ad is shown to convert the interest into a purchase.
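The behaviour-targeting idea above can be sketched in a toy example: if one topic dominates the pages a user has visited, an ad for that topic is shown. The topic labels and the 50 % threshold are invented for illustration:

```python
from collections import Counter

def ad_for(visited_topics, threshold=0.5):
    """Return the dominant topic if its share exceeds the threshold, else None."""
    if not visited_topics:
        return None
    topic, count = Counter(visited_topics).most_common(1)[0]
    return topic if count / len(visited_topics) >= threshold else None

# the user was on car-related pages 3 times out of 4
print(ad_for(["cars", "cars", "travel", "cars"]))  # cars
```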
Hummingbird update: a search algorithm used by Google. With it, search can interact with users in a more human way and provide more direct answers. What distinguishes Hummingbird from Penguin or Panda?
With Hummingbird, Google moves towards what was already being talked about years ago: the semantic web. This is a web that tries to understand the user instead of merely offering information. It means that the meaning of what you write matters more than the exact words you use.
For example, if you search for "Zara" you will now find different information than before. Google starts by using your location (if you have granted access to it) to provide more useful information; if you have not allowed it, it cannot be used. Whenever possible, Google shows a panel on the right with more information and the option to search for related information (the Google Knowledge Graph).
The keyword era is over; welcome to the new era of meaning and intention.
It is no longer about tuning our page and its content to match the keywords users type into the search box. Content should now focus on providing answers to what those users are seeking.
If you focus on content marketing, produce high-quality content, and take care to offer links and solutions for the needs of your users, you will meet the requirements that Google Hummingbird currently demands.
We do not know exactly which factors determine the precise position, but we think everything that is synonymous with quality will be rewarded: time on page, link quality, a low bounce rate, etc.
--> Karl Kratz can give more details about the topic here --> Onpage.org: this tool helps you analyse which content you should produce to push your website.
WDF * IDF
With the formula WDF * IDF it is possible to determine the weight of certain words in a text document or website relative to all potentially possible documents. The formula can be used for on-page optimisation to increase the relevance of a website for search engines, so that keyword density alone no longer plays the decisive role.
--> WDF measures how often a keyword is used on all pages of a website. The scale is determined by the pages you are comparing. I can see where a competing page has a higher value, for example for "Städtereise" --> then I can choose to produce content for this keyword.
--> IDF shows how often the keyword is used across the whole web.
--> WDF x IDF is another indicator that helps to decide whether it makes sense to work on improving content or to look for niche keywords.
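The WDF and IDF measures above can be sketched as follows, assuming base-2 logs for WDF and base-10 for IDF (tools differ in the exact formula); the example documents are invented:

```python
import math

def wdf(term, doc_words):
    """Within-document frequency: log-dampened share of the term in one text."""
    freq = doc_words.count(term)
    total = len(doc_words)
    if freq == 0 or total < 2:
        return 0.0
    return math.log2(freq + 1) / math.log2(total)

def idf(term, docs):
    """Inverse document frequency: terms that appear in fewer docs weigh more."""
    containing = sum(1 for d in docs if term in d)
    if containing == 0:
        return 0.0
    return math.log10(len(docs) / containing)

docs = [
    "städtereise nach berlin eine städtereise planen".split(),
    "reise nach chile mit rucksack".split(),
    "urlaub am meer buchen".split(),
]
print(wdf("städtereise", docs[0]) * idf("städtereise", docs))
```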
--> The higher the WDF x IDF factor, the higher the competition and the more content I would need to produce to reach the same value or get closer.
Text in content - a few tools for writing clean texts:
--> textinspektor.de (TA index)
--> leichtlesbar.ch (FL value)
--> stilversprechend.de (marks possible improvements such as noise words, word repetitions and overly long sentences, plus the passive rate)
--> homepageentwickler.de/text-analysis-Tool.php (marks repeated words and filler words)
--> schreiblabor.com (marks filler words, anglicisms, long sentences and long words - the word length can be adjusted manually)
--> wortliga.de / text analysis
Two important concepts: co-citation and co-occurrence.
Create a briefing for a PR officer. First, be clear which page structure / URL keyphrases we are working on.
Keyword: Rucksack Reise Chile
Synonyme
density
WDF+IDF
type of content (text style, type of language), target group
text length
Links (anchor texts), outbound
Accentuate with multimedia content
Online PR (dissemination, spread)
Interaction (answers to questions, explanatory videos, encouraging comments, etc.)
Today we are talking about the importance of links. Companies should take a look at and review their old links, and also pay attention to new ones. Why? Because this can help the user find the correct product and also makes the category stronger. --> Remove bad links: first directly, by addressing the site that set the link, or do it with the disavow tool. --> Build good links:
Today we start talking about the importance of content for good off-page optimisation.
I am attractive for links from outside. I make my site's content clear when I separate the domains (blog.reisen.de / shop.reisen.de). We can also use HTML (the canonical tag) to explain which of our contents is the most important one. The relevant page gets the rel="canonical" mark in the HTML or via the webmaster tools.
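A minimal example of the canonical tag just mentioned; the URL is a hypothetical page on the reisen.de example domain used in these notes:

```html
<!-- placed in the <head> of the duplicate page, pointing at the main one -->
<link rel="canonical" href="https://www.reisen.de/chile/gruppenreisen/" />
```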
With a spider simulator (WooRank, page report, webconf, website analysis) I can check how Google interprets my site. Google assigns specific text areas attributes, a type and an importance. Google analyses the names of the links, the anchor texts and the content. Unnatural links (anchor: backpack-travel-chile / "I was on a great backpacking trip in Chile...") are punished, as is duplicate content.
Google finds the source of unique content, which means we should avoid copying content. High-quality backlinks must be topically relevant. Page content and page titles must match the search query. If a keyword appears unnaturally often, the content is considered inferior (it will be registered as spam).
Bad content achieves a short dwell time (time on page / time on site) and a high bounce rate - both push my PageRank down very quickly. Google observes the user experience, considers the cross-linking density, distinguishes content by relevance and recognises the contexts in which things are mentioned.
Sistrix (free: smart.sistrix.com) gives me a good overview of my current ranking, of competitors, inbound links, the linking domains and hosts, etc. The visibility index indicates how present I am in the network: my keywords are weighted according to position and search volume, and the weighted values are summed into my visibility index.
External links can of course be bought; they are only useful if the contents fit. A regular backlink check (backlinktest.com, validator.w3.org, linkresearchtools.com) shows me that all links are still clean.
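The visibility-index idea described above can be sketched in a toy example: every ranking keyword is weighted by its position and its search volume, and the weighted values are summed. The 1/position weighting and all numbers are invented; Sistrix does not publish its exact formula.

```python
def visibility_index(rankings):
    """rankings: list of (position, monthly_search_volume) pairs."""
    return sum((1.0 / position) * volume / 1000.0
               for position, volume in rankings)

# three keywords: position 1 with 5000 searches/month,
# position 3 with 9000, position 10 with 2000
print(round(visibility_index([(1, 5000), (3, 9000), (10, 2000)]), 2))
```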
How can we improve our website?
--> use natural link setting
--> also separate the content into thematic pages when setting links
--> links from different IP addresses and from as many different partners as possible
--> avoid too many affiliate links, because these can make your page weak
Criteria for partner selection (also in relation to affiliate marketing):
--> PageRank
--> indexed pages
--> third-party sites for evaluation: Alexa
--> anchor texts should be natural and not point to link trading (not the same keywords too often)
--> the keyword should be in the vicinity of the links
--> unique content
--> quality backlinks
--> no keyword spam (natural density)
Parameters for evaluating a partner:
--> PageRank
--> Alexa rank
--> visibility index
--> authority in the network
--> quality: on-page analysis; quantity of links, link popularity, domain popularity, IP popularity; trust flow (dwell time, etc.), citation flow (number of links, where the links come from, and their attributes: dofollow, nofollow, site-wide links, image links, text links)
--> status of indexed pages; identifying relevant themes
Link types
--> follow link
--> nofollow link (the quantity is counted, but no strength is passed on) - 2/3 rule
--> site-wide link
--> post link
Some programs for making a free SEO analysis:
Panda (the quality of the site is measured by its content)
Penguin (it shows me where the links are, because webspam should be avoided)
Link exchange --> Link exchange in a circle does not make sense, because it cancels itself out. --> More info about backlinks here
Hi there cyberworld... today we talk about the importance of keywords in SEA: page structure / internal linking.
--> The domain (should always be the "strongest") --> We should use categories (for example: Chile Travel) --> We should use subcategories (for example: group travel). Take a look, for example, at what Zalando does:
(Not all pages should be linked with all others; for example, one should only be able to reach the subcategory "group travel" after the category "Chile" was called.) Reisen.de / chile / group travel / mountains / testimonials / book
There should not be too many links from the domain to the first categories: with about 30 categories, link only about 10 of them, and from there go on to the subcategories.
Tool for analysing the structure of a homepage:
https://strucr.com/
--> This link is also quite interesting to read:
--> Important to know: to find out how many pages a domain has, simply prefix the domain with the command site: in a Google search (for example: site:reisen.de). On this topic there is an important article: Google's 200 Ranking Factors
There are also a couple of concepts that are helpful to know:
Google PageRank: how important our webpage is for Google (a score of 0 is not good, 10 is the best).
Google Index: we can have, for example, 222 pages on the web, and from this total I get the PageRank. The thing is that we cannot know which parameters Google uses to position a web page; we can have a lot of pages and a low PageRank, or the opposite...
Webmaster tools!
How cool is this program! You can learn many things - just take a look at the image! This information is part of the search results and is generated automatically by Google, but it can only be shown if you help Google find these things on your website, blog, etc.
The webmaster tool includes various services / tools:
Submit sitemaps, and verify them
See incoming links to the domain
See indexing details of the domain
Receive HTML suggestions
Evaluate crawling problems, such as dead links (broken links)
Check robots.txt
Search query report: lists the keywords for which the domain was found
Set a preferred domain (for example, www.example.com or example.com)
Determine the crawling behaviour for parameters
Configure sitelinks
Confirm a domain transfer
HTML markup is needed for this. With it, you mark up content so that Google can display it correctly in the search results. To tag a page, just use the Data Highlighter, which has several categories:
With the highlighter I then mark the content on my website that is relevant for Google:
--> Under HTML improvements, suggestions for improving the code are proposed.
--> Under Sitelinks I can exclude sub-pages from being shown as links in the site view.
Searches --> Here I can see how visitors came to my site (search queries, links) and how they moved around on it (internal links).
Google index --> Here I can view the indexing status, the main content keywords and the excluded URLs.
Crawling --> This section contains crawler statistics, error messages (for example, if pages are missing or links lead nowhere), blocked URLs and the option to submit a sitemap.
There is a neutral platform, schema.org, which is independent of Google; we can use it to organise information and improve the search results.
Also this microdata generator:
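As a hedged illustration, this is the kind of markup such a generator produces (the schema.org types are real; all values are invented):

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Rucksackreise Chile</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price" content="1499.00">1499.00</span>
    <meta itemprop="priceCurrency" content="EUR" />
  </div>
</div>
```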
Content keywords: it is important to know the value of every keyword, so that we can push one specific piece of content, for example: marketing-management
Robots.txt
The robots from Google! These are our friends, and we can talk to them to improve the search results: I can say, for example, when I want a content to be allowed or disallowed (set priorities), for example: allow /images-reisen
And I can also say which of my contents is the most important or the real one (then the robot knows which are applications or different blogs in other countries):
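A minimal robots.txt sketch along those lines (the paths besides /images-reisen are invented):

```
User-agent: *
Disallow: /intern/
Allow: /images-reisen/

Sitemap: https://www.example.com/sitemap.xml
```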
Meta-Tags
Meta elements are tags used in HTML or XHTML documents to provide structured metadata about a Web page. They are part of a web page's head section. Multiple Meta elements with different attributes can be used on the same page. Meta elements can be used to specify page description, keywords and any other metadata not provided through the other head elements and attributes.
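A minimal head section with the meta elements described above (all values invented; Google itself no longer evaluates the keywords tag):

```html
<head>
  <meta charset="utf-8" />
  <title>Rucksackreise Chile | reisen.de</title>
  <meta name="description" content="Tipps und Planung für eine Rucksackreise durch Chile." />
  <meta name="robots" content="index, follow" />
  <meta name="keywords" content="rucksack, reise, chile" />
</head>
```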
I recommend this video; it is very clear and helps us understand the importance of an SEO strategy and how we can implement it with the Webmaster Tools program.
Today we start with SEO (search engine optimisation), i.e. how we can improve our website and help it get good results in search. The Google crawler crawls all sites to filter out and interpret their content. In doing so, it looks at the HTML code of the website. Basically, there are two major sections:
Head: contains components invisible to the user, but which are evaluated by the crawler. There is, for example, the keywords meta tag, which is no longer respected by the Google crawler (but still is by other search engines). Body: the part of the website visible to users. This, of course, is also searched by the Google crawler.
To get an idea of how a crawler sees a web page, take a look at a spider simulator, for example http://www.webconfs.com/search-engine-spider-simulator.php:
The following is weighted / analysed:
Order (what is at the beginning, what at the end?)
Density (how often are which words used?) --> (a density between 3 and 6 is fine; much more than that is bad)
What is in the title tag?
What is in the URL? What is in the subdomain / folder name?
SEOQuake can provide a density analysis for any public page:
Page structure: optimise relative to keywords (short, mid and long tail)
Basically, the shorter the URL, the better! Domain name (brand or keyword): www.suedamerika-reisen.de
Category: Chile Travel - Sports Travel Chile, Backpacking Chile, ... Category: Argentina Travel - Sports Travel Argentina, Backpacking Argentina. Important: put the information in the header and subtitle --> we always have to give a name; then we have a better chance of appearing in the search.
--> Black Hat SEO: they use any method available to give visibility and ranking to the website or page they want to promote - no interest in ethics, only the quick result. What is sought is to appear in search engines, not quality content. They do this in several ways: creating groups of dummy pages with or without links to the target page; blog spam, where the method is to send spam links to any blog whose configuration accepts posted links; scraping, which in short is plagiarising content from a popular search result and pasting it into one's own website; parasite hosting, which is hosting a website on someone else's server illegally and without their consent; as well as more complex programming techniques.
--> White Hat SEO: in short, using real information. Taking intelligent advantage of search engines by giving them the information they need to rank the page in question. All of this uses real data; the content has to exist and be verifiable by Google or another search engine, and it qualitatively enriches the content of the network. It is ethical optimisation.
--> Grey Hat SEO: this is the grey zone, meaning a mix of both approaches (you find this in shops, for example Zalando).
Seitenreport: here we can get a scan with a full diagnosis of a website, showing for example problems with the programming.