DevStart Blogi

All content posted by DevStart Blogi

  1. Posted by rMaynes1 You are exposed to an average of 362 online display ads a day. How close are you to buying anything when you see those ads? Online display ads have been around for over 20 years. They’re nothing new. But over the past 2 decades, the content, format, and messaging of display ads have changed dramatically—because they have had to! The click-through rate of that first banner ad in 1994 was 44%. CTRs have steadily declined, and were sitting at around 0.1% in 2012 for standard display ads (video and rich media excluded), according to DoubleClick. Advertisers had to do something to ensure that their ads were seen, and engaged with—ads had to be a useful resource, and not an annoying nuisance. It’s important, however, that the focus is not firmly fixed on CTRs. Yes, online display ads have largely been considered a tool for direct response advertising, but more recently, advertisers are understanding the importance of reaching the right person, in the right mindset, with an ad that can be seen. This ad may not be clicked on, but does that mean it wasn’t noticed and remembered? Advertisers are increasingly opting to pay for performance as opposed to clicks and/or impressions. Advertisers want their ad to drive action that leads to purchase—and that isn’t always in the form of a click. Mediative recently conducted and released a research study that looks at how display ads can drive purchase behaviour. If someone is browsing the web and sees an ad, can it influence a purchase decision? Are searchers more responsive to display ads at different stages in the buying cycle? What actions do people take after seeing an ad that captures their interest? Ultimately, Mediative wanted to know how indicative of purchase behaviour a click on an ad was, and if clicks on display ads even matter anymore when it comes to driving purchase behaviour and measuring campaign success. The results from an online survey are quite interesting. 1. 
The ability of online display ads to influence people increases as they come closer to a purchase decision. In fact, display ads are 39% more likely to influence web users when they are researching a potential purchase versus when they have no intent to buy. Advertiser action item #1: Have different ad creatives with different messaging that will appeal to the researcher and the purchaser of your product or service separately. Combined with targeted impressions, advertisers are more likely to reach and engage their target audience when they are most receptive to the particular messaging in the ad. Here are a few examples of Dell display ads and different creatives that have been used: This creative is focusing on particular features of the product that might appeal more to researchers. This ad injects the notion of “limited time” to get a deal, which might cause people who are on the fence to act faster—but it doesn’t mention pricing or discounts. These creatives introduce price discounts and special offers which will appeal to those in the market to buy. 2. The relevancy of ads cannot be overstated. 40% of people took an action (clicked the ad, contacted the advertiser, searched online for more information, etc.) from seeing an ad because it was relevant to a need or want, or relevant to something they were doing at the time. Advertiser action item #2: Use audience data or lookalike modeling in display campaigns to ensure ads will be targeted to searchers who have a higher likelihood of being interested in the product or service. Retargeting ads to people based on their past activity or searches is valuable at this stage, as potential customers can be reached all over the web while they comparison shop. An established Canadian charitable organization ran an awareness campaign in Q2 2015 using retargeting, first and third party data lookalike modeling, and contextual targeting to help drive existing and new users to their website. 
The goal was to drive donations, while reducing the effective cost per action of the campaign. This combination helped drive granularity in the targeting, enabling the most efficient spending possible. The result was an eCPA of $76 against the goal of $600, nearly eight times better than target. 3. Clicks on ads are not the only actions taken after seeing ads. 53% of people said they were likely to search online for the product featured in the ad (the same as those who said they would click on the ad). Searching for more information online is just as likely as clicking the ad after it captures attention, just not as quickly as a click (74% would click on the ad immediately or within an hour, 52% would search online immediately or within an hour). Advertiser action item #3: It is critical not to measure the success of a display campaign by clicks alone. Advertisers can get caught up in CTRs, but it’s important to remember that ads will drive other behaviours in people, not just a click. Website visits, search metrics, etc. must all be taken into consideration. A leading manufacturer of PCs, laptops, tablets, and accessories wanted to increase sales in Q2 of 2014, with full transparency on the performance and delivery of the campaign. The campaign was run against specific custom audience data focusing on people of technological, educational, and business interest, and was optimized using various tactics. The result? The campaign achieved a post-view ROI revenue (revenue from target audiences who were presented with ad impressions, yet did not necessarily click through at that time) that was 30x the amount of post-click revenue. 4. Clicks on ads are not the only actions that lead to purchase. 33% of respondents reported making a purchase as a direct result of seeing an ad online. Of those, 61% clicked and 44% searched (multiple selections were allowed), which led to a purchase. Advertiser action item #4: Revise the metrics you measure. 
Measuring "post-view conversions" will take into account the fact that people may see an ad, but act later—the ad triggers an action, whether it be a search, a visit, or a purchase—but not immediately, and it is not directly measurable. 5. The age of the target audience can impact when ads are most likely to influence them in the buying cycle.Overall, 18–25 year olds are most likely to be influenced by online advertising. At the beginning of the buying cycle, younger adults aged 18–34 are likely to notice and be influenced by ads much more than people aged over 35. At the later stages of the buying cycle, older adults aged 26–54 are 12% more likely that 18–25 year olds to have made a purchase as a result of seeing an ad.Advertiser action item #5:If your target audience is older, multiple exposures of an ad might be necessary in order to increase the likelihood of capturing their attention. Integrated campaigns could be more effective, where offline campaigns run in parallel with online campaigns to maximize message exposure. 6. Gender influences how much of an impact display ads have. More women took an online action that led to a purchase in the last 30 days, whereas more men took an offline action that led to a purchase. 76% more women than men visited an advertiser’s website without clicking on the ad. 47% more women than men searched online for more information about the advertiser, product, or service. 43% more men than women visited the advertiser’s location. 33% more men than women contacted the advertiser.Advertiser action item #6:Ensure you know as much about your target audience as possible. What is their age, their average income? What sites do they like to visit? What are their interests? The more you know about who you are trying to reach, the more likely you will be to reach them at the right times when they will be most responsive to your advertising messages. 7. Income influences how much of an impact display ads have. 
Web users who earned over $100k a year were 35% more likely to be influenced by an ad when exposed to something they hadn’t even thought about than those making under $50k a year. When ready to buy, people who earned under $20K were 12.5% more likely to be influenced by ads than those making over $100K. Advertiser action item #7: Lower earners (students, part-time workers, etc.) are more influenced by ads when ready to buy, so will likely engage more with ads offering discounts. Consider income differences when you are trying to reach people at different stages in the buying cycle. 8. Discounts don't influence people if they are not relevant. We were surprised that the results of the survey indicated that discounts or promotions in ads did not have more of an impact on people—but it’s likely that the ads with coupons were irrelevant to the searcher’s needs or wants, and therefore would have no impact. We asked people what their reasons were behind taking action after seeing an online ad. 40% of respondents took an action from seeing an ad for a more purchase-related reason than simply being interested—they took the action because the ad was relevant to a need or want, or relevant to something they were doing at the time. Advertiser action item #8: Use discounts strategically. Utilizing data in campaigns can ensure ads reach people with a high intent to buy and a high likelihood of being interested in your product or service. Turn interest into desire with coupons and/or discounts—it will have more of an impact if directly tied to something the searcher is already considering. In conclusion, to be successful, advertisers need to ensure their ads are providing value to online web users—to be noticed, remembered, and engaged with, relevancy of the ad is key. Serving relevant ads that are related to a searcher’s current need or want is far more likely to capture attention than a "one-size-fits-all" approach. 
Advertisers will be rewarded for their attention to personalization with more interaction with ads and a higher likelihood of a purchase. Analyzing lower funnel metrics, such as post-view conversions, rather than simply concentrating on the CTR will allow advertisers to have a far better understanding of how their ads are performing, and the potential number of consumers that have been influenced. Rebecca Maynes, Manager of Content Marketing and Research with Mediative, was the major contributor on this whitepaper. The full research study is available for free download at Mediative.com. Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! View full article
  2. The book officially premiered on Tuesday, i.e. ordered copies should have been reaching you since Tuesday (and from what I know, judging by the numerous photos posted on fb/microblogs/twitter/etc, they are indeed arriving more or less smoothly, and the first typos have already been found). So I decided to write a new post with various bits of information about the book, its distribution, author meetups, the book's companion site, and so on. I have already written about some of these here and there, but I figured it would be good to gather them from posts/comments/fb/etc in one place. ░▒▓ Author meetups As a reminder, there will be three: ■ November 7 (Saturday) - Warsaw - more information + registration (RSVP, FCFS) ■ November 9 (Monday) - Kraków - more information + registration. NOTE: everyone who has already registered is asked to confirm their attendance using the new form on the linked page (you should have received an email about this yesterday or the day before). If you indicated on the previous form that you would bring someone, accompanying persons should also confirm their attendance using the new form. Those who have not signed up yet can still do so using the same form (Michał from Sekurak arranged a bigger room - kudos!). ■ November 19 (Thursday) - Wrocław - registration opens on Monday; space may be rather limited (RSVP, FCFS) - watch for a post on the blog. See you IRL :) As for other cities - maybe something more can be organized in 2016, but we will talk about that at the beginning of the new year. ░▒▓ Print runs and reprints As I wrote in the previous post, the first, enlarged print run (2k copies) sold like hot cakes - 5 days before the end of the presale we had to temporarily suspend it (until a reprint was approved). The first reprint (1k copies) has been approved, and since yesterday (Wednesday) copies can be ordered again on the PWN website at the presale price (i.e. just under 50 PLN). The price will remain unchanged until the run sells out. Copies from the reprint will be ready to ship on November 19 - sorry for the delays (honestly, nobody expected the first run to sell out almost on the spot). Plot twist: according to info I got today (Oct 29) from the publisher, fewer than 300 copies are left. A second reprint (1k copies) has also already been approved. It will be at the regular price (70 PLN), and shipping of copies from this pool will start around the turn of November and December. If there is still demand after the second reprint sells out, there will be more, ofc. ░▒▓ E-book There will be an EPUB+Mobi version. Expected release: December 1. As for buying the ebook separately / as a bundle: ■ There will probably be an option to buy the ebook at a much lower price for those who bought the book in the presale. ■ There will probably be an option to buy the ebook + paper book together at a suitably lower price. ■ The ebook alone will cost 80% of the book's price, i.e. about 56 PLN. ░▒▓ Book site, flags, typos, bug bounty The book's site "is being made" (i.e. I am making it in the evenings after work). I realize it should have been ready long ago and simply switched on around the premiere; unfortunately (as usual) I underestimated the time needed and there are delays (I love deadlines. I love the whooshing noise they make as they go by. - Douglas Adams). In any case, the book's site is located (I should probably phrase that in the future tense) at: https://zrozumiecprogramowanie.pl/ Note: there is no http:// version - only https:// works - I will fix that in a few days. Most of the links from the book already work, except for two exercises (they should be online tomorrow). The interactive part is also missing, i.e.: ■ Registration/login/reader panel ■ A page for submitting flags + Hall of Fame ■ A page for reporting typos + Hall of Fame ■ A bug bounty page + Hall of Fame + errata ■ A mini-forum I will be adding these gradually in the evenings - once again, sorry for the delays. One more thing - I encourage you to hold off on reporting typos/flags/factual errors - as soon as the relevant options appear on the site, each finding will be tied directly to your account, so you will get credit in the Hall of Fame and I will have everything in one place and will not lose anything. I will add that the postcard design (for every factual error that a given person is the first to find, I will send a commemorative postcard) is ready, and in a few days I will order them from the printer. As soon as they arrive, I will post a photo. As usual, thanks to xa for adapting the design :) ░▒▓ Reviews and opinions None have appeared yet. When they do, I will link them here (and later move the links to the book's site). By the way, if anyone has already read enough to form an opinion, I encourage you to post it (whatever it may be) on the PWN page (the "Opinie użytkowników" tab at the bottom of the page). I have seen that quite a few people online are looking for opinions. PS. Yes, I know, the publisher still has not reset the ratings after the bot raid (a bot/while loop). I will ping them for the Nth time in a moment. OK, that's it for now. If I remember anything else, I will add it :) View full article
  3. The InternalsVisibleTo attribute is used to define friend assemblies. A “friend” assembly is one that has access to classes and methods marked with the “internal” modifier. It is usually used to make internal classes testable. It sometimes happens that classes in a library have the internal modifier and therefore cannot be accessed directly from tests. With InternalsVisibleTo you can make an exception for a specific assembly, in this case the test project. It is enough to put the following in the AssemblyInfo.cs file of the library containing the internal classes: [assembly:InternalsVisibleTo("AnyName.Tests")] From that moment on, AnyName.Tests can use the internal classes of the project containing this attribute. Of course, remember that unit tests should focus on testing behavior, not internals. In many cases this attribute is a signal that we are testing the wrong thing. Sometimes, however, testing through the public API can be too complicated and imprecise. Personally, I sometimes use this attribute when the logic in internal classes is too complex to be tested solely through the public API. If the assembly is strong-named (digitally signed), we have to provide the full public key alongside the name. Both assemblies (the logic and the tests) must then be signed. In that case, next to the name we also provide the public key, for example: [assembly:InternalsVisibleTo("TestCoverage.Tests,PublicKey=002400000480000094000000060200000024000052534131000400000100010085d32843e5e1f42acd023289dacebe34befbf561bdbb163367bb727f9292824db5aac63c7e72e45e273809937050d21230653def915ecc91e87d1eb4313cc4ed7357fd61d7698790901d1134ba34a9ce0f82f3dfb0e9bad9c3120a3a6324a333718636b232f4a0b41c72428f2d8704d2da83edc496fe2325816bc8dfdad8feae")] As the example above shows, we paste in the full public key, not its token. To obtain it, open the Developer Command Prompt and run the following command: sn -Tp TestCoverage.Tests.dll This command prints both the full public key and its token. View full article
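The setup described above can be sketched end to end. This is a minimal illustration, not code from the post: the assembly name MyLib.Tests and the PriceCalculator class are made up for the example.

```csharp
// --- Library project: AssemblyInfo.cs ---------------------------------
// Name the test assembly that is allowed to see internal members.
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("MyLib.Tests")]

// --- Library project: PriceCalculator.cs ------------------------------
namespace MyLib
{
    // internal: invisible outside this assembly, except to declared friends.
    internal static class PriceCalculator
    {
        internal static decimal ApplyDiscount(decimal price, decimal percent)
            => price - price * percent / 100m;
    }
}

// --- Test project "MyLib.Tests" ---------------------------------------
// Thanks to the attribute above, this assembly can call the internal
// class directly instead of going through a public wrapper.
namespace MyLib.Tests
{
    public static class PriceCalculatorTests
    {
        public static void DiscountIsApplied()
        {
            decimal result = MyLib.PriceCalculator.ApplyDiscount(100m, 10m);
            if (result != 90m) throw new System.Exception("expected 90");
        }
    }
}
```

Without the attribute, the test project would fail to compile with a "PriceCalculator is inaccessible due to its protection level" error. For a strong-named library, replace the plain name with the full `Name,PublicKey=...` string exactly as shown in the post.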
  4. Posted by jenstar If you're a webmaster, you probably received one of those infamous “Googlebot cannot access CSS and JS files on example.com” warning letters that Google sent out to seemingly every SEO and webmaster. This was a brand new alert from Google, although we have been hearing from the search engine about the need to ensure all resources are unblocked—including both JavaScript and CSS. There was definite confusion around these letters, supported by some of the reporting in Google Search Console. Here's what you need to know about Google’s desire to see these resources unblocked and how you can easily unblock them to take advantage of the associated ranking boosts. Why does Google care? One of the biggest complaints about the warning emails lay in the fact that many felt there was no reason for Google to see these files. This was especially true because it was flagging files that, traditionally, webmasters blocked—such as files within the WordPress admin area and WordPress plugin folders. The letter in question, which many received from Google, definitely raised plenty of questions and concerns. Of course, whenever Google does anything that could devalue rankings, the SEO industry tends to freak out. And the confusing message in the warning didn’t help the situation. Why Google needs it Google needs to render these files for a couple of key reasons. The most visible and well known is the mobile-friendly algorithm. Google needs to be able to render the page completely, including the JavaScript and CSS, to ensure that the page is mobile-friendly and to apply both the mobile-friendly tag in the search results and the associated ranking boost for mobile search results. Unblocking these resources was one of the things that Google was publicly recommending to webmasters to get the mobile-friendly boost for those pages. However, there are other parts of the algorithm that rely on using it, as well. 
The page layout algorithm, the algorithm that looks at where content is placed on the page in relation to the advertisements, is one such example. If Google determines a webpage is mostly ads above the fold, with the actual content below the fold, it can devalue the rankings for those pages. But with the wizardry of CSS, webmasters can easily make it appear that the content is front and center, while the ads are the most visible part of the page above the fold. And while it’s an old school trick and not very effective, people still use CSS and JavaScript in order to hide things like keyword stuffing and links—including, in the case of a hacked site, to hide it from the actual website owner. Googlebot crawling the CSS and JavaScript can determine if it is being used spammily. Google also has hundreds of other signals in their search algo, and it is very likely that a few of those use data garnered from CSS and JavaScript in some fashion as well. And as Google changes things, there is always the possibility that Google will use it for future signals, as well. Why now? While many SEOs had their first introduction to the perils of blocking JavaScript and CSS when they received the email from Google, Matt Cutts was actually talking about it three-and-a-half years ago in a Google Webmaster Help video. Then, last year, Google made a significant change to their webmaster guidelines by adding it to their technical guidelines: Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings. It still got very little attention at the time, especially since most people believed they weren’t blocking anything. However, one major issue was that some popular SEO Wordpress plugins were blocking some JavaScript and CSS. Since most Wordpress users weren’t aware this was happening, it came as a surprise to learn that they were, in fact, blocking resources. 
It also began showing up in a new "Blocked Resources" section of Google Search Console in the month preceding the mobile-friendly algo launch. How many sites were affected? In usual Google fashion, they didn’t give specific numbers about how many webmasters received these blocked resources warnings. But Gary Illyes from Google did confirm that they went out to 18.7% of the sites that received the mobile-friendly warnings earlier this year. Finding blocked resources The email that Google sent to webmasters alerting them to the issue of blocked CSS and JavaScript was confusing. It left many webmasters unsure of what exactly was being blocked and what was blocking it, particularly because they were receiving warnings for JavaScript and CSS hosted on other third-party sites. If you received one of the warning letters, the suggestion for how to find blocked resources was to use the Fetch tool in Google Search Console. While this might be fine for checking the homepage, for sites with more than a handful of pages, this can get tedious quite quickly. Luckily, there's an easier way than Google's suggested method. There's a full walkthrough here, but for those familiar with Google Search Console, you'll find a section called “Blocked Resources” under the “Google Index” menu, which will tell you which JavaScript and CSS files are blocked and which pages they're found on. You also should make sure that you check for blocked resources after any major redesign or when launching a new site, as it isn’t entirely clear if Google is still actively sending out these emails to alert webmasters of the problem. Homepage There's been some concern about those who use specialized scripts on internal pages and don’t necessarily want to unblock them for security reasons. John Mueller from Google said that they are looking primarily at the homepage—both desktop and mobile—to see what JavaScript and CSS are blocked. 
So at least for now, while it is certainly a best practice to unblock CSS and JavaScript from all pages, at the very least you want to make it a priority for the homepage, ensuring nothing on that page is blocked. After that, you can work your way through other pages, paying special attention to pages that have unique JavaScript or CSS. Indexing of Javascript & CSS Another reason many sites give for not wanting to unblock their CSS and JavaScript is because they don’t want them to be indexed by Google. But neither of those files are file types that Google will index, according to their long list of supported file types for indexation. All variations It is also worth remembering to check both the www and the non-www for blocked resources in Google Search Console. This is something that is often overlooked by those webmasters that only tend to look at the version they prefer to use for the site. Also, because the blocked resources data shown in Search Console is based on when Googlebot last crawled each page, you could find additional blocked resources when checking them both. This is especially true for sites that may be older or not updated as frequently, and not crawled daily (like a more popular site is). Likewise, if you have both a mobile version and a desktop version, you'll want to ensure that both are not blocking any resources. It's especially important for the mobile version, since it impacts whether each page gets the mobile-friendly tag and ranking boost in the mobile search results. And if you serve different pages based on language and location, you'll want to check each of those as well. Don’t just check the “main” version and assume it's all good across the entire site. It's not uncommon to discover surprises in other variations of the same site. At the very least, check the homepage for each language and location. 
Wordpress and blocking Javascript & CSS If you use one of the "SEO for Wordpress"-type plugins for a Wordpress-based site, chances are you're blocking Javascript and CSS due to that plugin. It used to be one of the “out-of-the-box” default settings for some to block everything in the /wp-admin/ folder. When the mobile-friendly algo came into play, because those admin pages were not being individually indexed, the majority of Wordpress users left that robots block intact. But this new Google warning does require all Wordpress-related JavaScript and CSS be unblocked, and Google will show it as an error if you block the JavaScript and CSS. Yoast, creator of the popular Yoast SEO plugin (formerly Wordpress SEO), also recommends unblocking all the JavaScript and CSS in Wordpress, including the /wp-admin/ folder. Third-party resources One of the ironies of this was that Google was flagging third-party JavaScript, meaning JavaScript hosted on a third-party site that was called from each webpage. And yes, this includes Google’s own Google AdSense JavaScript. Initially, Google suggested that website owners contact those third-party sites to ask them to unblock the JavaScript being used, so that Googlebot could crawl it. However, not many webmasters were doing this; they felt it wasn’t their job, especially when they had no control over what a third-party sites blocks from crawling. Google later said that they were not concerned about third-party resources because of that lack of control webmasters have. So while it might come up on the blocked resources list, they are truly looking for URLs for both JavaScript and CSS that the website owner can control through their own robots.txt. John Mueller revealed more recently that they were planning to reach out to some of the more frequently cited third-party sites in order to see if they could unblock the JavaScript. 
While we don’t know which sites they intend to contact, it was something they planned to do; I suspect they'll successfully see some of them unblocked. Again, while this isn’t so much a webmaster problem, it'll be nice to have some of those sites no longer flagged in the reports. How to unblock your JavaScript and CSS For most users, it's just a case of checking the robots.txt and ensuring you're allowing all JavaScript and CSS files to be crawled. For Yoast SEO users, you can edit your robots.txt file directly in the admin area of Wordpress. Gary Illyes from Google also shared some detailed robots.txt changes on Stack Overflow. You can add these directives to your robots.txt file in order to allow Googlebot to crawl all Javascript and CSS. To be doubly sure you're unblocking all JavaScript and CSS, you can add the following to your robots.txt file, provided you don't have any directories being blocked in it already:

User-Agent: Googlebot
Allow: .js
Allow: .css

If you have a more specialized robots.txt file, where you're blocking entire directories, it can be a bit more complicated. In these cases, you also need to allow the .js and .css for each of the directories you have blocked. For example:

User-Agent: Googlebot
Disallow: /deep/
Allow: /deep/*.js
Allow: /deep/*.css

Repeat this for each directory you are blocking in robots.txt. This allows Googlebot to crawl those files, while disallowing other crawlers (if you've blocked them). However, the chances are good that the kind of bots you're most concerned about being allowed to crawl various JavaScript and CSS files aren't the ones that honor robots.txt files. You can change the User-Agent to *, which would allow all crawlers to crawl it. Bing does have its own version of the mobile-friendly algo, which requires crawling of JavaScript and CSS, although they haven't sent out warnings about it. 
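To make the precedence concrete, here is a rough sketch of why an Allow rule like /deep/*.js can override Disallow: /deep/. This is a simplified illustration, not Googlebot's actual implementation: among all rules whose pattern matches a path, the longest (most specific) pattern wins, and Allow beats Disallow on a tie. Real robots.txt matching also supports "$" end anchors and per-agent rule groups, which are omitted here.

```csharp
using System;
using System.Collections.Generic;

// Simplified sketch of robots.txt rule precedence.
static class RobotsRules
{
    public static bool IsAllowed(string path,
        IEnumerable<(bool Allow, string Pattern)> rules)
    {
        bool? verdict = null;
        int bestLen = -1;
        foreach (var r in rules)
        {
            if (!Matches(path, r.Pattern)) continue;
            // Longest matching pattern wins; Allow wins a tie.
            if (r.Pattern.Length > bestLen ||
                (r.Pattern.Length == bestLen && r.Allow))
            {
                bestLen = r.Pattern.Length;
                verdict = r.Allow;
            }
        }
        return verdict ?? true; // no rule matched: crawling is allowed
    }

    // '*' matches any run of characters; literal pieces must appear in order.
    static bool Matches(string path, string pattern)
    {
        string[] parts = pattern.Split('*');
        int pos = 0;
        for (int i = 0; i < parts.Length; i++)
        {
            if (parts[i].Length == 0) continue;
            int idx = i == 0
                ? (path.StartsWith(parts[i], StringComparison.Ordinal) ? 0 : -1)
                : path.IndexOf(parts[i], pos, StringComparison.Ordinal);
            if (idx < 0) return false;
            pos = idx + parts[i].Length;
        }
        return true;
    }

    static void Main()
    {
        var rules = new[] { (Allow: false, Pattern: "/deep/"),
                            (Allow: true,  Pattern: "/deep/*.js") };
        Console.WriteLine(IsAllowed("/deep/app.js", rules));    // True: Allow pattern is longer
        Console.WriteLine(IsAllowed("/deep/page.html", rules)); // False: only Disallow matches
        Console.WriteLine(IsAllowed("/index.html", rules));     // True: no rule matches
    }
}
```

The takeaway for the robots.txt snippets above: the blanket Disallow on a directory still applies to HTML pages inside it, while the more specific Allow lines carve out the .js and .css files Googlebot needs to render them.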
Bottom line If you want to rank as well as you possibly can, unblocking JavaScript and CSS is one of the easiest SEO changes you can make to your site. This is especially important for those with a significant amount of mobile traffic, since the mobile ranking algorithm does require they both be unblocked to get that mobile-friendly ranking boost. Yes, you can continue blocking Googlebot from crawling either of them, but your rankings will suffer if you do so. And in a world where every position gained counts, it doesn’t make sense to sacrifice rankings in order to keep those files private. View full article
  5. Posted by Bill.Sebald A process can easily become a habit. A habit may not change without awareness or intervention. Before it becomes a habit, a process should be adjusted to change along with new goals, constant learning, experimentation, and so on. Considering your time in analytics, are you engaging in a process, or in an outdated habit? That’s a real question that digital marketing practitioners should ask themselves. Inherently, marketers tend to be buried with work, reusing templates to speed up results. But many agencies lean on those templates a little too much, in my opinion. Templates should never be written in stone. If your company is pumping out canned reports, you’re not alone. I do the business development for our company and regularly ask prospects to explain or share the reports they’ve received in the past. Sometimes it’s truly discouraging, outdated, wasteful, and the reason businesses search for new SEO vendors. Look—I’m all for scalability. It’s a huge help. But some things can’t be scaled and still be successful, especially in today’s SEO climate—or, frankly, marketing in general. Much of what was scalable in SEO prior to 2011 is now penalty-bait. Today’s analytics tools and platforms can slice and dice data faster than anything Ron Popeil ever sold, but the human element will always be necessary if you want your marketing to dominate.

Find the stories to tell

I like to tell stories. I’m real fun in the pub. What I’ve always loved about marketing is the challenge to not only find a story, but have that story change something for the better. I like adding my layer based on real data and experimenting. Analytics work is all about finding the story. It’s detective work. It's equal parts Sherlock Holmes, Batman, and Indiana Jones. If you’re lucky, the story jumps out with very little digging. However, it’s more likely you’ll be going on some expeditions. 
It’s common to start with a hunch or a random click through the reports, but you need to always be looking for the story. A great place to start is client conversations. We schedule at least one monthly call with our clients, where it’s truly a discussion session. We get conversations going to pull intel out of the key stakeholders. Case in point: we recently discovered through an open discussion that one of our clients had great success with an earlier email campaign targeted to business owners. There was specific information customers responded to positively, which was helpful in recent content development on their website. It's amazing what you can learn by asking questions and simply listening to the responses. We should be true consultants, not report monkeys. Dive into the discussions and enjoy the ride. I guarantee you’ll take note of a few ripe areas to review next time you log into your Google Analytics account.

An impromptu survey says it’s a time issue

Most SEO engagements are designed around a block of purchased hours. Hopefully the client understands they’re not only buying your time to complete SEO tasks, but also your expertise and analysis. If someone on your team were to say, “I don’t have time to do analysis because all my tasks used up their budget this month,” then you really need to question the value of the chosen tasks. Were they picked based on front-loaded analysis, or were they simply pulled out of guesswork? A few weeks ago I pushed a quick SurveyMonkey survey out on Twitter and LinkedIn. Thanks to a few retweets, 94 people responded (please consider the following results more directional than scientific—I’m well aware it’s a shallow survey pool). I asked two questions:

1. If you work in-house or have clients, how often do you log into your clients’ analytics? (Multiple choices ranged from several times a day to a few times a month.)
2. Do you, or do you not, get enough time in analytics to interpret the data?
The responses: While some do make a habit of logging into analytics one or more times a day, most do not. Is it required to check under the hood every day? Personally, I believe it is—but your answer may vary on that one. If something went south overnight, I want to be aware of it before my client tells me. After all, that’s one of the things I’m paid for. I like the idea of being proactive—not reactive. More notable is that most respondents didn’t feel they get enough time in analytics. That should absolutely change. There was also a field for respondents to elaborate on their selections. Several comments jumped out at me:

“In house, day to day tasks and random projects prevent me from taking the deep dives in analytics that I feel are valuable.”
“It’s challenging to keep up with the changes and enhancements made in Google Analytics in particular, amongst other responsibilities and initiatives.”
“Too many things are on my plate for me to spend the time I know I should be spending in Google Analytics.”
“Finding the actionable info in analytics always takes more time than expected—never enough time to crunch the numbers!”
“I log in to 'spot check' things but rarely do I get to delve into the data for long enough to suss out the issues and opportunities presented by the data.”

These results suggest that many marketers are not spending enough time with analytics, and possibly not because they don’t see the value, but simply because they don’t have the time. “Either you run the day, or the day runs you” (Jim Rohn) is apropos here—you must make time. You need to get on top of all the people filling your plate. It’s not easy, but it needs to be done. Kind of like professional crowd surfing.

Helpful resources

Dashboards are fantastic, but I rarely see them set up in analytics platforms. One of the best ways to get a quick glimpse of your key metrics is with dashboards.
All good analytics platforms provide the ability to make custom dashboards. Get into work, grab a coffee, fire up the computer, click your dashboard bookmark. (I recommend that order!) Google Analytics, which most of us probably use, provides some decent options with its dashboards, though limited compared to enterprise analytics platforms. However, this basic dashboard is the minimum you should review in analytics. We’ll get deeper soon. Building these widgets is quite easy (I recently created a tutorial on my site). There are also websites that provide dashboards you can import into Google Analytics. Dashboard Junkie is a fun one. There are others from Econsultancy and from Google themselves. It’s not just analytics platforms that offer dashboards. There are several other vendors in the SEO space that port in analytics data and mesh it with their own—from Moz Analytics to SearchMetrics to Conductor to many, many others. SEMrush has a unique data set that marketers should routinely review. While your traffic data in analytics will be truer, if you’re targeting pages you may be interested in monitoring keyword rank counts: Are backlinks a target? Maybe you’d find Cognitive SEO’s dashboard valuable: RankRanger is another SaaS we use. It’s become far more than just our daily rank-tracking software. The data you can port in creates excellent snapshots, graphs, and strong dashboards: It also offers other graphing functionality to create some pretty useful views: While some of the bigger platforms, like SearchMetrics and Conductor, make it easier to get a lot of information within one login, I still find myself logging into several programs to get the most useful data possible. C’est la vie.

Analytics is your vehicle for identifying problems and opportunities

Remember, dashboards are simply the “quick and dirty” window into your site. They help spotlight drastic changes, and make your website’s general traction more visible.
Certainly valuable for when your CMO corners you by the Keurig machine. It’s a state of the union, but it doesn’t focus on the subsections that may need attention. Agencies and consultants tend to create SEO reports for their clients as a standard practice, though sometimes these reports become extremely boilerplate. Boilerplate reports essentially force you to look under the same rocks month after month. How can you get a bigger view of the world if you never leave your comfortable neighborhood? A new routine needs to be created by generating new reports and correlations, finding trends that were hidden, and using all the tools at your disposal (from analytics to link tools to competitive tools). Your analytics app is not a toy—it’s the lifeblood of your website.

Deeper dives with Google Analytics

Grouped pages lookup

A quick way to look at chunks of the site is to identify a footprint in the URL and search with that. For example, go to Behavior > Site Content > All Pages or Landing Pages. Then, in the search bar right below the graph, search for the footprint. For example, take www.mystoreisdabomb.com/blog/2015/ as a real URL. If you want to see everything in the blog, enter */blog/ into the search bar. This is especially useful for getting the temperature of an eCommerce category.

Segment sessions with conversions/transactions

So often in SEO we spend our time analyzing what’s not working or what's posing as a barrier. This report helps us take a look at what is performing (by leads or sales generated) and the customer behavior, channels, and demographic information that goes along with it. Then we can identify opportunities to make use of our success and improve our overall inbound strategy. Below is a deeper dive into the conversions “Lead Generation” segment, although these same reports can just as aptly be applied to transactions.
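The grouped-pages footprint lookup described above can also be reproduced outside the GA interface, for instance against an exported landing-page report. Below is a minimal sketch, assuming a hypothetical CSV export with "Page" and "Sessions" columns (the file name and column headers are illustrative, not a Google Analytics API):

```python
import csv

def pages_matching_footprint(csv_path, footprint):
    """Mimic the search-bar filter under Behavior > Site Content:
    keep rows whose page path contains the footprint, and sum their sessions."""
    matched, total_sessions = [], 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if footprint in row["Page"]:
                matched.append(row["Page"])
                total_sessions += int(row["Sessions"])
    return matched, total_sessions

# e.g. blog_pages, blog_sessions = pages_matching_footprint("landing_pages.csv", "/blog/")
```

The same idea extends to any URL footprint, such as an eCommerce category prefix.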
Ultimately, there are a lot of ways to slice and dice the analysis, so you’ll have to know what makes sense for your client, but here are three different reports from this segment that provided useful insights to enhance our strategy.

Conversions

One of the easiest and most valuable ones! Directions: under any report, go to Add a Segment > Sessions with Conversions > Apply.

Demographics – age, gender, location

For example, our client is based in Pennsylvania, but is receiving almost as many request form submissions from Texas and New York, and has a high ratio of request form submissions to visitors for both of these other states. Given our client’s industry, this gives us ideas on how to market to these individuals, and on the additional information the Texans may need given the long distance.

Mobile – overview, device type, landing pages

For this client, we see more confirmation of what has been called the “micro-moment,” in that our mobile users spend less time on the site, view fewer pages per visit, have a higher bounce rate, and are more likely to be new users (less brand affinity). This would indicate that the site is mobile-optimized and performing as expected. From here, I would next go into mobile traffic segments to find pages that aren’t receiving a lot of mobile traffic but are similar to those that are, and find ways to drive traffic to those pages as well.

Acquisition

Here we’re looking at how the inbound channels stack up for driving conversions. Organic and paid channels are neck and neck, although referral and social are unexpected wins (and social, glad we’ve proven your viability to make money!). We’ll now dig deeper into the referring sites and social channels to see where the opportunities are.

Assisted conversions

There’s more to the story than last click. In analytics, go to Conversions > Multi-Channel Funnels > Assisted Conversions. Many clients have difficulty understanding the concept of attribution.
This report seems to provide the best introduction to the world of attribution. Last click isn’t going to be replaced anytime soon, but we can start to educate and optimize for other parts of the funnel.

True stories from analytics detective work

Granted, this is not a post about favorite reports. But it is a post about why digging through analytics can open up huge opportunities. So, it’s real-life example time from Greenlane’s own experience!

Story 1: The Forgotten Links

The client is a big fashion brand. They’ve been a popular brick-and-mortar retail destination since the early '80s, but only went online in 1996. This is the type of company that builds links based on their brand ambassadors and trendy styles. SEO wasn’t the mainstream channel it is today, so it’s likely they had some serious architecture changes since the '90s, right? For this company, analytics data can only be traced back about seven years. We thought, “Let’s take a look at what drove traffic in their early years. Let’s see if there were any trends that drove volume and sales where they may be slipping today. If they had authority then, and are slipping now, it might be easier to recoup that authority than to build it from scratch.” The good news: this brand had essentially maintained the authority they launched with, as there were no real noticeable gaps between search data then and search data today. But, in the digging, we uncovered a gem. We found a lot of URLs that used to draw traffic that are not on their site tree today. After digging further, we found a redesign had occurred in the late '90s. SEO wasn’t factored in, creating a ton of 404s. These 404s were not even being charted in Google Webmaster Tools, yet they are still being linked to today from external sites (remember, GWT is still quite directional in terms of the data it provides). Better yet, we pulled links from OSE and Majestic, and saw that thousands of forgotten links existed.
This is an easy campaign—create a 301 redirect matrix for those dead pages and bring those old backlinks back to life. But we kept wondering what pages were out there before the days when analytics was implemented. Using the Wayback Machine, we found that even more redesigns had occurred in the first few years of the site’s life. We didn’t have data for these pages, so we had to get creative. Using Screaming Frog, we crawled the Wayback Machine to pull out URLs we didn’t know existed. We fed them into the link tools, and sure enough, there were links there, too.

Story 2: To “View All” or Not To “View All”

Most eCommerce sites have pagination issues. It’s a given. A seasoned SEO knows immediately to look for these issues. SEOs use rel=”next” and rel=”prev” to help Google understand the relationships. But does Google always behave the way we think it should? Golly, no! Example 2 is a company that sells barware online. They have a lot of products, and tend to show only “page 1” of a given category. Yet the analytics showed instances where Google preferred to show the “view all” page. These were long “view all” pages which, compared to the “page 1” pages, showed a much lower bounce rate and higher conversions. Google seemed to prefer them in several cases anyway, so a quick change to default to “view all” started showing very positive returns within three months.

Story 3: Selling What Analytics Says to Sell

I have to change some details of this story because of NDAs, but once upon a time there was a jewelry company that sold artisan products. They were fond of creating certain kinds of keepsakes based on what sold well in their retail stores. Online, though, they weren't performing very well selling these same products. The website was fairly new and hadn't quite earned the footing they thought their brand should have, but that wasn't the terminal answer we wanted to give them.
Instead, we wanted to focus on areas where they could compete, while building up the entire site and turning their offline brand into an online brand. Conversion rates, search metrics, and even PPC data showed a small but consistent win on a niche product that didn't perform nearly as well in the brick-and-mortar stores. It wasn't a target for us or the CEO. Yet online, there was obvious interest. Not only that: with low effort, this series of products was poised to score big in natural search due to low competition. The estimated search volume (per Google Keyword Planner) wasn't extraordinary by any stretch, but it led to traffic that spent considerable dollars on these products. So much so, in fact, that this product became a focus point of the website. Sometimes, mining through rocks can uncover gold (jewelry pun intended).

Conclusion

My biggest hope is that your takeaway after reading this piece is a candid look at your role as an SEO or digital marketer. You’re a person with a “unique set of skills,” being called upon to perform works of brilliance. Being busy does create pressure; that pressure can sometimes force you to look for shortcuts or to “phone it in.” If you really want to find the purest joy in what you’ve chosen as a career, I believe it comes from the stories embedded within the data. Go get ’em, Sherlock!
6. Posted by bridget.randolph As mobile technology becomes an increasingly common way for users to access the internet, you need to ensure that your mobile content (whether on a mobile website or in a mobile app) is as accessible to users as possible. In the past this process has been relatively siloed, with separate URLs for desktop and mobile content, and with apps tucked away in app stores. But as app and mobile web usage continues to rise, the ways in which people access this content are beginning to converge, which means it's becoming more important to keep all of these different content locations linked up. This means that the way we think about managing our web and mobile content is evolving. So how do we improve the interaction between these different types of content and different platforms, getting to the point of having a single URL which takes the user to the most appropriate version of the content based on their personal context? The first step is to ensure that we are correctly implementing deep linking (e.g., linking to a particular screen within an app) for apps which have comparable webpage content, to allow our app content to rank in mobile search. Image credit: Google Developers Google indexation provides benefits for both Android and iOS apps. The benefits for Android apps are twofold: users searching on an Android device who have not yet installed your app will see the app show up in mobile search results; and Android users who do have your app installed will get query autocompletions when they use browser search, which can include results from your app, as well as enhanced display elements in the SERP (such as the app icon). It’s basically like rich snippets for apps. Image credit: Google Developers On iOS, app ranking is currently only supported for apps already installed on the device. Apple users should see search results which include links to installed apps, along with the enhanced display elements mentioned above.
In addition, Google recently announced that mobile apps which use the new App Indexing API for deep linking may receive a rankings boost in mobile web search. They are releasing a new and improved version of Google Now, "Now on Tap," in their latest OS update (Android M), which allows you to search content across your phone without navigating out of whatever app (or website) you are currently using. The catch is that app content has to be in their index in order to be included in a "Now on Tap" search. It’s not just Google, either: Apple is implementing its own version of a search index to allow iOS 9 users to search and discover web and app content without using a third-party search engine, Bing has its own approach to app indexation and ranking, and other services aren’t far behind. This post, however, will focus on how to set up your Android and iOS apps to appear in Google search results. While the idea of app indexation isn’t new, it is an area of rapid innovation, and the process for getting your apps indexed by Google has recently been simplified. This post is therefore intended to provide a brief overview of that process and to serve as an update to the information which is currently available.

The implementation

The good news is that it’s getting simpler to add the relevant markup to your web content and get your app content indexed and ranking in mobile search results. The basic process is only three steps:

1. Support HTTP deep links in your mobile app. For iOS you will need to do this by setting up support for "Universal Links." "Universal Links" are what Apple calls HTTP links that have a single URL which can open both a specific page on a website and the corresponding view in an app.
Note: At this point, you can register your app with Google, associate it with your website, and stop there—as long as you are using the same URLs for your web content and your app content, Google should be able to automatically crawl, index, and attempt to rank your app content based on your website’s structure. However, implementing App Indexing and explicitly mapping your web content to your app content using on-page markup can provide additional benefits and allow for a bit more control. Therefore, I recommend following the full process, if possible.

2. Implement Google App Indexing using the App Indexing API for Android, or by integrating the App Indexing SDK for iOS 9.

3. Explicitly map your web pages to their corresponding app screens using a rel="alternate" link element on the individual page, by referencing the app URLs in your XML sitemaps, or by using schema.org markup.

You can find a more step-by-step explanation of this process (looking at Android and iOS separately) below. The app indexation process used to be a bit more complex, because HTTP links aren’t supported by older iOS versions. Instead, developers had to use something called "Custom URL Schemes" to link to iOS app content. This meant that you essentially had to create a unique scheme for your app URLs and then add support for it in the app code. Custom URL schemes have a couple of other downsides besides adding complexity, namely: different app developers can claim the same custom URL scheme, whereas with HTTP links you can associate the app with a particular domain or set of domains; and with custom URL schemes, tapping the URL when the app isn’t installed results in a broken link (because it only links to content within the app), whereas HTTP links are web links as well and can take the user to a webpage if the app isn’t installed (as long as the URL is the same for both the app view and the corresponding webpage).
While you can still use the custom URL scheme approach, the good news is that Google’s App Indexing is now compatible with HTTP deep link standards for iOS 9, which Apple calls "Universal Links." You should still add markup to any webpages which have content corresponding to a particular app screen. Think of it like rel="canonical" or mobile switchboard tags, but for apps. Be aware that when Google finds a link between a webpage and an app page which they think are equivalent, they will compare the two, and you will receive a "Content Mismatch" error in the Search Console if they don’t believe the content is similar enough.

Getting Android apps indexed in Google

Step 1: Support HTTP deep links in your app by adding intent filters to your manifest.

An intent filter is a way of specifying how an app responds to a particular action. Intent filters for deep links have three required elements: <action>, <category>, and <data>. You can find more guidance on this from Google Developers. Here is their example of an intent filter which enables support for HTTP deep links:

<intent-filter android:label="@string/filter_title_viewrecipes">
  <action android:name="android.intent.action.VIEW" />
  <category android:name="android.intent.category.DEFAULT" />
  <category android:name="android.intent.category.BROWSABLE" />
  <data android:scheme="http"
        android:host="recipe-app.com"
        android:pathPrefix="/recipes" />
</intent-filter>

Noindex option: Just like for websites, you can add noindex directives for app content. Include a noindex.xml file in your app to indicate which deep links should not be indexed, and then reference that file in the app’s manifest (AndroidManifest.xml). You can find more detail on how to create and reference the noindex.xml file here.

Step 2: Associate your app with your site in Google Search Console.

This is done in Google Search Console (you can also do it from the Developer Console).
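As a reference point for the noindex option described in Step 1, a noindex.xml resource might look like the sketch below. This is a hedged example: the URIs are hypothetical, so check the Google Developers guidance for the exact current format.

```xml
<?xml version="1.0" encoding="utf-8"?>
<search-engine xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Block a single deep link from being indexed -->
  <noindex uri="http://recipe-app.com/recipes/secret-recipe"/>
  <!-- Block everything under a path prefix -->
  <noindex uriPrefix="http://recipe-app.com/internal/"/>
</search-engine>
```

The file is then referenced from AndroidManifest.xml via a meta-data element named "search-engine" pointing at this XML resource.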
As long as your app is set up to support deep links, this step is technically all you have to do to allow Google to start indexing your app. It allows Google to crawl and index your app automatically by attempting to figure out the app structure from your website structure. However, if you do stop here, you will not have as much control over how Google understands your content, which is why the explicit mapping of pages to app versions is recommended. Also, if you can’t use the API for some reason, you need to make sure that Googlebot can access your content. You can check that this is configured correctly by testing some of your deep links with the robots.txt tester tool in the Search Console.

Step 3: Implement app indexing using the App Indexing API.

Using the App Indexing API is definitely worthwhile; apart from anything else, apps which use the API should receive a rankings boost in mobile search results, and you don’t need to worry about Googlebot struggling to access your content. The App Indexing API allows you to annotate information about the activities within your app that support deep links (as laid out in your intent filters). For details on how to set this up, see the Google Developers guidance.

Step 4: Test your implementation.

You can test your implementation (always on a fresh installation of your app!) with the following tools. (Find more info about how to use each of these tools here.)

Android Debug Bridge – to test deep links from the command line
Fetch as Google (Search Console) – to test what Google sees when it crawls your app deep links

You can also track search traffic to these deep links in the Search Console’s Search Analytics report.

Getting iOS apps indexed in Google

Step 1: Support HTTP deep links in your app by setting up support for "Universal Links."
To support Universal Links in your iOS app, you first need to ensure that your app handles these links correctly by adopting the UIApplicationDelegate methods (if it doesn’t already use this protocol). Once this is in place, you can associate your app with your domain. You’ll do this by:

adding an "associated domains" entitlement file to your app’s project in Xcode that lists each domain associated with your app; and
uploading an apple-app-site-association file to each of these domains with the content your app supports—note that the file must be hosted at the root level and on a domain that supports HTTPS.

To learn more about supporting Universal Links, view the Apple Developer guidance.

Step 2: Register your app with Google (using the GoogleAppIndexing SDK for iOS 9).

You’ll need to add the App Indexing SDK to your app using the CocoaPods dependency manager. For step-by-step instructions, check the Google Developers’ guide. Essentially, this allows you to register your app with Google, just like Android apps are registered via the Search Console. It also means that Google can now read the apple-app-site-association file to understand what URLs your app can open.

Step 3: Test your implementation.

You can test whether this is set up correctly by tapping a universal link in Safari on an iOS 9 device and checking that it opens the right location in your app.

Mapping your webpages to your app with on-page markup or sitemaps

Once you’ve set up deep linking support for your Android and/or iOS app(s), the final step is to explicitly map the corresponding webpages to the correct app screens using one of the supported markup options. This step allows you to indicate more clearly to Google what the relationship is between a given page and its corresponding app link (both of which should already share the same URL if you are using HTTP links).
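For reference, a minimal apple-app-site-association file of the kind described in Step 1 of the iOS process might look like the sketch below. The team ID, bundle ID, and paths are placeholders, not values from the article:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.gizmos.ios",
        "paths": [ "/gizmos/*" ]
      }
    ]
  }
}
```

Here "apps" is left empty as Apple's format requires, and each entry in "details" pairs an app ID (team ID plus bundle ID) with the URL paths that app can handle. The file is served without a file extension, over HTTPS, from the root of each associated domain.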
Following this step also allows you to indicate the relationship to Bing's crawlers, which otherwise wouldn’t see the app content, and to allow Apple to index your iOS app. You can do this mapping in the head of the individual page using a link element, using schema.org markup (for Android only), or in an XML sitemap.

A note on formats for app links

An Android app link uses the format:

android-app://{package_name}/{scheme}/{host_path}

The {package_name} is the app’s "Application ID," which is how it is referenced in the Google Play Store. So a link to the (example) Gizmos app might look like this:

android-app://com.gizmos.android/http/gizmos.com/example

For iOS links, you use the app’s iTunes ID instead of the package name. So an iOS app URL uses this format:

ios-app://{itunes_id}/{scheme}/{host_path}

For HTTP links the {scheme} is "http," which would mean your URL would look like this:

ios-app://123456/http/gizmos/example

How to reference your app links

Note: Google provides guidance on the three currently supported deep link methods here.

Option 1: Link rel=alternate element

To add an app link reference to an individual page, you can use an HTML <link> element in the <head> of the page. Here is an example of how this might look if you have both an iOS and an Android app:

<html>
<head>
  ...
  <link rel="alternate" href="android-app://com.gizmos.android/http/gizmos.com/example" />
  <link rel="alternate" href="ios-app://123456/http/gizmos/example" />
</head>
<body>
  …
</body>
</html>

Option 2: Schema.org markup (currently supported on Android only)

Alternatively, if you have an Android app, you can use schema.org markup for the ViewAction potential action on an individual page to reference the corresponding app link.
Here is an example of how this might look:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebPage",
  "@id": "http://gizmos.com/example",
  "potentialAction": {
    "@type": "ViewAction",
    "target": "android-app://com.gizmos.android/http/gizmos.com/example"
  }
}
</script>

Option 3: Add your app deep links to your XML sitemap

Instead of marking up individual pages, you can use an <xhtml:link> element in your XML sitemap, inside the <url> element specifying the relevant webpage. Here is an example of how this would look if you have both an iOS and an Android app:

<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://gizmos.com/example</loc>
    <xhtml:link rel="alternate" href="ios-app://123456/http/gizmos/example" />
    <xhtml:link rel="alternate" href="android-app://com.gizmos.android/http/gizmos.com/example" />
  </url>
  ...
</urlset>

Additional information

What about apps which don’t have corresponding web pages?

Unfortunately, as of this writing, Google does not officially offer app indexation for apps which don’t have corresponding web content. However, they are trying to move in this direction, and as such are beginning to try this out with a handful of apps with “app-only” content. If you have an app with app-only content and would like to get this content indexed, you can express interest using this form.

What about getting my app indexed in Bing?

Bing supports two open standard options for linking webpages to app links:

App Links
Schema.org

To learn more about how to implement these types of markup, see the guidance on the Bing blog.

Quick reference checklists

Will Critchlow recently spoke about app indexation in his presentation at Searchlove London. He provided two useful checklists for Android and iOS app indexing: Image source: http://www.slideshare.net/DistilledSEO/searchlove-...
To learn more about app indexing by Google, check out Emily Grossman and Cindy Krum’s excellent post over on SearchEngineLand.
7. Today we return to topics related to web application security. Over the next few posts I will be writing about XSS. Alongside SQL Injection, XSS is one of the most "popular" attacks carried out against web applications. While the principle behind it may seem primitive, many sites, even leading ones (e.g., Amazon), have been vulnerable to XSS. What's more, just as […]
8. Some time ago the Helion publishing house launched the videopoint service, which offers various video courses on programming, web development, and so on (generally the same range of topics as the books they publish). Personally, I was skeptical about the quality of the training available there, so when an AngularJS course landed in my hands, I checked with great curiosity how... Read more The post AngularJS. Kurs video – recenzja szkolenia na videopoint.pl appeared first on burczu programator.
9. Posted by randfish Outreach. It's arguably the most important part of the link building process—and also the most grueling. Good personalized outreach is impossible to scale effectively, and it's easy to fall into a rut. What should you be doing to maximize your success rate and stand out from the crowd? In today's Whiteboard Friday, Rand offers up some methods of bartering value to earn genuine links and catching your target's attention, and gives actionable advice on what exactly you need to include in your outreach correspondence. Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. This week we're going to chat about link building outreach in this skeptical, jaded world in which we are forced to live as marketers. Look, I think we know a few things. This is really a continuation of our Whiteboard Friday where we talked about the frustrating part of the flywheel, where social shares are just not, on average, in most cases, getting us to the links that we need in order to rank. So we know a few things. It is still the case that links are well-correlated with higher rankings. It's still the case that nearly every site and page that ranks well in Google has some input that is related to their link profile; sometimes that's a stronger influence and sometimes a weaker one. But we know we need links to rank well, especially in competitive search results. It's incredibly rare to earn those links just by publishing content and sharing it socially. Getting it in front of an audience, unless your audience is extremely link-likely, you've already built up some authority, and linking is a behavior you've acclimated your community to, is really, really tough. It's not going to work on its own. We also know that link outreach is a hard, grueling, manual process. There's no doubt about that. This is frustrating.
That's why many of us try to use social sharing or subscriptions or publication to attempt an end-around for that need for direct link outreach, because it's such a challenge. But what we need to talk about (and I know many of you mentioned this in the comments and over Twitter) is: what actually works for link outreach, and how can we make that process less painful and more likely to succeed?

I think the reality is that outreach fundamentally involves an exchange of value. As you're going out and attempting to earn a link from someone directly through link outreach, through that one-to-one relationship, whether that's happening on social media or happening in person or happening on the phone or happening via email, whatever it is, if you don't provide value, if you're simply asking for something, your success rate is going to be extremely low compared to the folks who do provide value prior to the link or, better, as part of the link. The link is the way the value is exchanged. That's actually what Google is looking for. They're not looking for someone who's very successful at convincing people to give them a link for no particularly good reason. They're looking for an exchange of value, where someone says, "Gosh, it provides value to my site and my visitors to link to this resource, and therefore I want to do it."

Some things that are often perceived to carry real value

Value can be a bunch of different things. It could be in the ego boost it provides. It could be in the problems it helps solve. It could be in the form of what you've given them in exchange. Lots of things. So these are things that we've seen that are often perceived to carry real value, and some of these are taken directly from the BuzzSumo and Moz study, where we looked at things that earned social shares and also earned links, and there were some good cases of those across different types of content.
Some of these are also things that inherently earn links as they are used, things like embeddable content. So I'll talk through these, because I think fundamentally, when it comes down to it, it's very tough for me to stand up here and say, "Oh, we did some research." I saw this at a conference recently, I think it was SearchLove, where someone noted that one of the things they'd been doing that's had much better success with their link outreach is reaching out to ask whether the target would like to see the piece they want them to link to, rather than sharing the piece with them directly. That gets a much higher email engagement rate, like, "Yeah, okay, I'll take a look at it." Then, when they do send it, those people tend to look at it and link to it more than if they'd just been sent the link right off the bat, because the conversation is already underway.

Okay, look, there are a lot of tactical tips like that. But if that fundamental thing, that piece with which you're providing value to the potential linker, doesn't carry real value in their eyes, you can't have any success, and that's why these items, I think, are so critical, so fundamental to the outreach formula.

Unique research. We've seen research perform very well, I think because unique research that provides value to entities and organizations and to content creators needs to be referenced. It needs that citation, and I think that's why research, especially research that you do yourself, and/or visuals or riffs you take off of research that's already been created, to analyze that data or to turn it into great graphs or interactive infographics, can provide real value. Well, I'm jaded about infographics personally, but I do believe that a lot of customized, high-quality visuals can work, and certainly infographics can be a form of that that does work in some sectors.
I think we're seeing that in tech, in marketing, in legal, and in a lot of places where you see a tremendous amount of outreach, infographics are actually losing out because they've become so saturated; every content creator in those niches has 10 people reaching out to them every week offering a new infographic. You're just not standing out from the crowd. But I do think there are other forms of visuals, everything from photography to illustrations to customized graphics and charts to drawings, that can be very valuable there. That's why I've mentioned it here.

Embeddable content is wonderful because it naturally acquires that link by saying, "Hey, here's a calculator, or here's a tool that you can embed on your site if you'd like to." You get that link back as part of the embed, and I think that can work great. We've also seen a decline, actually. Embeddable content used to be all the rage, say, 6 to 10 years ago. It's waned a little bit since, and for that reason, I think, it can be more powerful and stand out a little when it is used. So that's a tactic I would encourage folks to try again.

Badges are a form of this, but they're the mildest, least uniquely valuable form. So if you're going to do a badge, it had better be a badge back to something that is very powerful or really triggers a great commitment. If you're an Etsy top seller and you get a badge to put on your website, or an embeddable widget from your Etsy store so that people can buy directly from Etsy from your website, okay, those things provide real value, and, of course, I'm going to link to them. But just a badge that says, "I think you're a great blogger"? Tough.

APIs and data plus business development. These are tough things to build, but they can be very valuable.
If you're providing data on an ongoing basis, especially to large organizations or powerful entities who are using that data, either publicly or even privately, very often you can include in those agreements some form of, "Hey, we'd like some co-branding. We'd like you to link back to us. We'd like you to say the data was provided by us." Hard to do, but that's a great thing, because it's powerful and it gets that link.

Content that makes, well, your target look good. If you are essentially saying, "Hey, here's a piece of content. We did a truly substantive analysis of 5 or 10 players in the field. Your product, your service, you, your company, your content stands out in this way, and we've quantified that, and we've produced this piece," yeah, I'm going to be much more likely to link to that than to a, "Here's a badge that says we like you." So I think these can still work well, and playing to people's egos can still work well.

Guest content. We see guest content still doing very well despite Google's warnings about guest blogging. Of course, we talked about that a couple of years ago on Whiteboard Friday. Guest content is still very powerful, and it almost always includes a link back. The key is that this content has to actually provide value to the target. If that content does provide great value to the target, you can get a link from almost anywhere; the key is convincing them that it's going to perform well for them and with their audience. As a result, it's very easy for folks who already have a platform, who are already thought of as influencers and thought leaders, to get their content onto other sites. It's much tougher as an unknown, and this is one of the reasons why I think building up your platform first and then leveraging guest content can be so valuable.

The last one I'll mention here is a service or favor that makes your target want to refer people to you.
Now this is a challenging thing to accomplish, but if you are a service provider, a content provider, a data provider, or a product provider who has done something amazing and unusual, something that makes you stand out in the mind of a customer, and you know that customer has a website (that customer could be a business, an organization, an entity, those kinds of things) and that they often deal with people who need services like yours, reaching out and saying, "Hey, we'd love it if you'd refer folks, and here's what we're willing to offer," can be another great way to go.

The outreach email itself

This is the thing that gets talked about a lot, and I hear the same advice over and over again around link outreach. I get a little frustrated sometimes. It needs to be customized and well written, you need to flatter your target, and it shouldn't be automated. Those things are just table stakes. That is merely "send a good, competent email." It is not tactical, useful, actionable advice, and I get very frustrated when I see those same pieces of advice over and over again.

I think where you want to go is places that other outreach emails don't go. So if you can, try to look at a dozen or a hundred outreach emails from other sources sent to people like those in your target market. You know they've received those emails. You can even reach out to people in your audience who you already have a relationship with; I'm sure you have some relationships like that with influencers in your field already. Ask them, "Hey, can you send me the outreach emails that you get? I just want to take a look at them, because I think they're all terrible, and I never want to do that to people. I'll send you mine." What you will find is that they are almost never authentic. They're rarely humble. They almost never create a real connection.
In fact, the vast majority of real-connection emails I get from folks I've never met before are not about outreach, and I think that's what forms the real connection. I've seen a few outreach emails that do create a real connection: they have actually read or watched things I've made, or been at events I've been at, or worked with companies I've worked with, or whatever it is, and they form that real connection in the email authentically. They need to stand out as unique, meaning they don't look like those other 150 outreach emails.

This is the sucky part: these outreach emails do not scale. The ones that work tend not to scale, and it tends to be a link builder's job to scale this process, because you need lots of links, you need them to point to lots of pieces of content, and so you're always looking for scale. I would urge you to go the opposite direction. Narrow your funnel. Worry less about the number of people you're targeting and more about the success rate, because once you get the success rate high, you can turn up the volume really fast. But if your success rate is low and there's a limited market of influencers in your field, you can quickly burn all of them with your outreach before you ever have a chance to get good at it.

Link outreach is supposed to be hard.

This process is not supposed to scale. If it scaled, it would be easier, everyone would do it, and there'd be no competitive barrier, no competitive advantage to being great at building and earning links. So I think this frustration exists in the world. I want to recognize that and have empathy for it and for all of you who have to do link building outreach, but I also want to say that's part of the magic that happens here. So you should account for it and expect it and not fear it. All right, everyone. I look forward to your comments.
I'd love to hear your link building outreach strategies and tactics, what's worked for you, what hasn't. We'll see you again next week for another edition of Whiteboard Friday. Take care. Video transcription by Speechpad.com Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read! Wyświetl pełny artykuł
  10. As predicted (I wrote about this on FB a few days ago), the original print run of my book (already extended once) has sold out completely (as an aside: WOW, thanks for the trust!). My publisher immediately arranged a reprint with the printing house, but it will take a while, and the next run will be ready only in the second half of November. So copies ordered from roughly yesterday onward will ship only once they have been printed; we sincerely apologize for the delay. For that reason, the book will be unavailable for pre-order from today until Monday. On Monday, sales will resume (including the last day of pre-orders; after all, the premiere is on Tuesday ;>), for exactly those copies that will be printed in November. Once again, I apologize for the trouble (though from the author's side, it's a good problem to have). P.S. A recording of my talk from PyConPl'15 a week ago has been published. View full article
  11. .NET 4.5 introduced the Task.Run method. Out of habit, however, for a long time I used only Task.Factory.StartNew. Both methods are used to create a new task and start it immediately. The way they are invoked looks very similar: Let's take a look at the decompiled code of Task.Run: This means that Task.Run is nothing more than: Let's try to decipher what the parameters above mean. In the case of CancellationToken.None […] View full article
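The code listings from the original post did not survive extraction, so as a sketch of the equivalence the post describes: Task.Run(action) is commonly documented as shorthand for Task.Factory.StartNew called with the defaults shown below. (The actual decompiled body goes through an internal helper, so this is the public-API equivalent, not the literal decompiled source.)

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // The shorthand:
        Task t1 = Task.Run(() => Console.WriteLine("via Task.Run"));

        // The commonly cited equivalent spelled out with Task.Factory.StartNew:
        Task t2 = Task.Factory.StartNew(
            () => Console.WriteLine("via StartNew"),
            CancellationToken.None,               // no cancellation support
            TaskCreationOptions.DenyChildAttach,  // child tasks cannot attach to this one
            TaskScheduler.Default);               // always the thread-pool scheduler

        Task.WaitAll(t1, t2);
    }
}
```

One reason these defaults matter: Task.Factory.StartNew called without an explicit scheduler uses TaskScheduler.Current, which inside another task may not be the thread pool, whereas Task.Run always schedules on TaskScheduler.Default.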
  12. Posted by MarkTraphagen Recently, Google Webmaster Trends analyst Gary Illyes surprised many of us with a remark he made during his keynote Q&A with Danny Sullivan at SMX East in New York City. Illyes said that he recommended webmasters not remove the rel=author tag from their site content. Google had used rel=author as part of its Google Authorship feature that (potentially) displayed a special author rich snippet in search results for content using the tag. Google ended support of this feature in August 2014. The phrase that made everyone sit up and say, "Did he just say that?" was this: "...because it is possible Google might make use of [rel=author] again in the future." Even though Google's John Mueller made the same recommendation after he announced that Google was no longer making use of Google Authorship in search (to be precise, Mueller said leaving the tag in place "did no harm"), Illyes's statement seemed to shock many because Google has said nothing about Google Authorship or the rel=author tag since they said they stopped supporting it. In a subsequent Twitter exchange I had with Gary Illyes, he explained that if enough users are implementing something, Google might consider using it. I asked him if that meant specifically if more people started using rel=author again, that Google might make use of it again. Illyes replied, "That would be safe to say." Before I provide my commentary on what all this means, and whether we should expect to see a resumption of Google Authorship in Google Search, let me provide a brief overview of Authorship for anyone who may not be familiar with it. If you already understand Google Authorship, feel free to skip down to the Will Google Bring Back Authorship? section. A brief history of Google Authorship Google Authorship was a feature that showed in Google Search results for about three years (from July 2011 until August 2014). 
It allowed authors and publishers to tag their content, linking it to an author's Google+ profile, in order to provide a more-certain identification of the content author for Google. In return, Google said they might display an authorship rich snippet for content so tagged in search results. The authorship rich snippet varied in form over the three years Authorship was in use, but generally it consisted of the author's profile photo next to the result and his or her byline name under the title. For part of the run of Authorship, one could click on an author byline in search to see results showing related content from that author. Google Authorship began with an official blog post in June of 2011 where Othar Hansson announced that Google would begin supporting the rel=author tag, but with no specifics on how they might use it. Then in a July 2011 video, Hansson and Matt Cutts explained that Google+ would be the hub for author identification, and that Google might start showing a special Authorship rich snippet result for properly tagged content. Those rich snippets slowly began appearing for more and more authors using rel=author over the next several months. During the three years of the program, Google experimented with many different configurations of the rich snippet, and also which authors and content would get it in response to various search queries. Interest in Google Authorship from the SEO and online marketing communities was spurred even more by its possible connection to Google's Agent Rank patent, first revealed by Bill Slawski. In this patent, Google described a system by which particular "agents" or "entities" could be identified, scored by their level of authority, and that score then be used as a search ranking factor. Since one of the types of agents identified in the patent was a content author, the patent rapidly became known as "author rank" in the SEO community. 
The connection with Authorship in particular, though, came from Cutts and Hansson stating in the above-mentioned Authorship video that Google might someday use Authorship as a search ranking factor. Speculation about so-called Author Rank, and whether or not it was "on" as a ranking factor, continued throughout the life of the Authorship program. Throughout that period, however, Cutts continued to refer to it as something Google might do in the future. (You can find my own take on why I believed Authorship was never used as a direct ranking factor here.) The first hint that Google might be drawing back from Authorship came at Pubcon Las Vegas in October 2013 when Matt Cutts, in his keynote "State of Search" address, revealed that at some point in the near future Google would be cutting back on the amount of Authorship rich snippets shown by "around 15%." Cutts said that in experiments, Google found that reducing Authorship rich snippets by that much "improved the quality of those results.” Sure enough, in early December of that year, Moz's Peter Meyers detected a rapid decline over several days in the number of Authorship rich snippets in search results, as measured by his Mozcast Features tool. Around that same time Google implemented what I called "two-class Authorship," a first class of authors who continued to get the full rich snippet, and a second class who now got only a byline (no author photo). Finally, in August 2014, this author was contacted directly by John Mueller, offering to share some information under an NDA embargo until the information was made public. In my call with Mueller, he told me that he was letting me know 24 hours in advance that Google Authorship was going to be discontinued. He added that he was making this call as a courtesy to me since I had become the primary non-Google source of information about Authorship. 
With that information, Eric Enge and I were able to compose an in-depth article on Authorship and its demise for Search Engine Land that went live within two minutes of John Mueller's own public announcement on Google+. In our article linked above, Eric and I give our takes on the reasons behind the death of Authorship and the possible future of author authority on Google.

Will Google bring back Authorship?

From the day Authorship was "killed" in August 2014, we heard no more about it from Google—until Gary Illyes's remarks at SMX East. So do Gary's remarks mean we should expect to see a return of Google Authorship to search results? I don't think so, at least not in any form similar to what we saw before. Let me explain why.

1. Illyes made no promise. Far too often people take statements about what Google "could" or "might" do from spokespersons like Gary Illyes, Matt Cutts, and John Mueller and translate "could/might" to "will." That is unfair to those spokespeople, and an abuse of what they are saying. Just because something is spoken of as a possibility, it does not follow that a promise is being made.

2. It ain't broke, so.... If there are no actual plans by Google to restore Google Authorship, why would Illyes make a point of stating publicly that authors and publishers should continue to use the rel=author tag? I think a primary reason may be that once Google gets any set of people to begin using any kind of schema, they'd rather have it remain in place. Anything that helps better organize the information on web pages is good for a search engine, whether or not that particular information is "in play" at present. In the case of rel=author, I think it may still be useful to Google to be able to have confidence about content connected with certain authors. When Authorship ended, many people asked me if I was going to remove the tags from my content. I responded, "Why would I?" Having them there doesn't hurt anything.
But more important, as an author trying to build my personal brand reputation online, why wouldn't I want to give Google every possible hint about the content with which I should be identified?

3. The reasons why Authorship was killed still remain. As with any change in Google search, we'll probably never know all the reasons behind it, but the public reasons stated by John Mueller centered around Google's commitment to a "mobile first" user experience strategy. Mobile first is a recognition that search is more and more a mobile experience. Recently, Google announced that more searches are now done on mobile than on desktop. That trend will likely never reverse. In response, we've seen Google continually moving toward simpler, cleaner, less-cluttered design in all its products, including search. Even their recent logo redesign was motivated by the requirements of the small screen. According to Mueller, Authorship snippets were too much clutter for a mobile world, with not enough user benefit to justify their continuation.

In our Search Engine Land article, Eric Enge and I speculated that another reason Google may have ended the Authorship experiment was relatively poor adoption of the tagging, low participation in Google+ (which was being used as the "anchor" on Google's side for author identification), and incorrect implementation of the tags by many who did try to use them. On the latter point, Enge conducted a study of major publishers, which showed that even among those who bothered to implement the authorship tagging, the majority were doing it wrong. That was true even among high-tech and SEO publications! All that points to a messy and lopsided signal, not the kind of signal a search engine wants. At the end of the day, Google couldn't guarantee that a result showing an Authorship rich snippet was really any better than the surrounding results, so why give it such a prominent highlight?
Despite Gary Illyes saying that if more sites used rel=author Google might begin using it again, I don't see that doing so would change any of the conditions stated above. Therefore, I believe that any future use of rel=author by Google, if it ever occurs, will look nothing like the Authorship program we knew and loved. So is there any future for author authority in search? To this question, I answer a resounding "Yes!" Every indication I've had from Googlers, both publicly and privately, is that author authority continues to be of interest to them, even if they have no sound way to implement it yet. So how would Google go about assessing author identity and authority in a world where authors and publishers will never mass-tag everything accurately? The answer: the Knowledge Graph, entity search, and machine learning. The very first attempts at search engines were mostly human-curated. For example, the original Yahoo search was fed by a group of editors who attempted to classify every web page they came across. But as the World Wide Web took off and started growing exponentially, it was quickly obvious that such attempts couldn't scale. Hyperlinks between web pages as a means of assessing both the subject matter and relative authority of web pages proved to be a better solution. Search at the scale of the web was born. Remember that Google's actual mission statement is to "organize the world's information." Over time, Google realized that just knowing about web pages was not enough. The real world is organized by relationships between entities—persons, places, things, concepts—and Google needed a way to learn the relationships between those things, also at scale. The Knowledge Graph is the repository of what Google is learning, and machine learning is the engine that helps them do that learning at scale. At a simple level, search engine machine learning is the development of an algorithm that learns on its own as a result of feedback mechanisms. 
Google is applying this technology to the acquisition of and linking together of entities and their relationships at scale. It's my contention that this process will be the next evolutionary step that will eventually enable Google to identify authors who matter on a given topic with their actual content, evaluate the relative authority of that content in the perceptions of readers, and use that as a search ranking factor. In fact, Matt Cutts seemed to hint at a Knowledge Graph-based approach in a June 2013 video about the future of authorship where he talked about how Google was moving away from dependence on keywords, from “strings to things,” figuring out how to discover the “real-world people” behind web content and “their relationships” to improve search results. Notice that nothing in a machine learning process is dependent upon humans doing anything other than what they already do on the web. The project is already underway. Take a moment right now and ask Google, "Who is Mark Traphagen?" If you are in the US or most English-speaking countries, you'll probably see this at the top of the results: That's a Knowledge Panel result from Google's Knowledge Graph. It reveals a couple of things: 1. Google has a high confidence that I'm likely the droids, er, the "Mark Traphagen" you're looking for. There are a few other Mark Traphagens in the world who potentially show up in Google Search, but Google sees that the vast majority of searchers who search for "Mark Traphagen" are looking for a result about me. Thanks, everybody! 2. Google has high confidence that the Mark Traphagen you're looking for is the guy who writes for Search Engine Land, so that site's bio for me is likely a good instant answer to your lifelong quest to find the Real Mark Traphagen (a quest some compare to the search for the Holy Grail). 
If Google can continue to do that at scale, then they can lick a problem like assessing author authority for search rankings without any help from us, thank you very much. How does all this fit with Gary Illyes's recommendation? I think that while Google knows it ultimately has to depend on machine learning to carry off such projects at scale, any help we can give the machine along the way is appreciated. Back in the Google Authorship I days, some of us (myself included) believed that one of the real purposes for the Authorship project was to enlist our help in training the machine learning algorithm. It may be that rel=author is still useful for that. What might Authorship look like in the future? Allow me to speculate a bit. I don't expect we'll ever again see the mass implementation of author rich snippets we saw before, where almost anyone could get highlighted just for having used the tagging on their content and having a Google+ profile. As I stated above, I think Google saw that doing that was a non-useful skewing of the results, as more people were probably clicking on those rich snippets without necessarily getting a better piece of content on the other end. Instead, I would expect that Google would see the most value in identifying the top few authors for any given topic, and boosting them. This would be similar to their behavior with major brands in search. We often see major, well-known brands dominating the top results for commercial queries because user behavior data tells Google that's what people want to see. In a similar way, people might be happy to be led directly to authors they already know and trust. They really don't care about anyone else, no matter how dashing their profile image might be. Furthermore, for reasons also stated above, I don't expect that we'll see a return to the full rich snippets of the glory days of Authorship I. 
Instead, the boost to top authors might simply be algorithmic; that is, other factors being equal, their content would get a little ranking boost for queries where they are relevant to the topic and the searcher. It's also possible that such author's content could be featured in a highlighted box, similar to how we see local search results or Google News results now. But notice what I said above: "...when [the authors] are relevant to the topic and the searcher." That latter part is important, because I believe it is likely that personalization will come into play here as well. It makes sense that boosting or highlighting a particular author has the most value when my search behavior shows that author already has value to me. We already see this at work with Google+ posts in personalized (logged in) search. When I search for something that AJ Kohn has posted on Google+ while I'm logged in to my Google account, Google will elevate that result to my first page of results and even give it a good old-fashioned Authorship rich snippet! Google has high confidence that's a result I might want to see because AJ is in my circles, and my interactions with him and his content show that he is probably very relevant and useful to me. Good guess, Google, you're right! It is now obvious that Google knows they have to expand beyond Google+ in entity identification and assessment. If Google+ had taken off and become a real rival to Facebook, Google's job might have been a lot easier. But in the end, building machine learning algorithms that sniff out our “who's who” and “who matters to whom” may be an even better, if vastly more difficult, solution. So to sum up, I do expect that at some point in the future, author authority will become a factor in how Google assesses and ranks search results. However, I think that boost will be a "rich get richer" benefit for only the top, most reputable, most trusted authors in each topic. 
Finally, I think the output will be more subtle and personalized than we saw during the first attempt at Authorship in search. How to prepare for Authorship II Since it is unlikely that Authorship II, the future implementation of author identity and authority in search, will be anything like Authorship I, is there anything you can be doing to increase the odds that Authorship II will benefit you and your content? I think there are several things. 1. Set a goal of being the 10X content creator in your niche. Part of the Gospel According to Rand Fishkin these days is that "good, unique" content is not good enough anymore. In order to stand out and get the real benefits of content, you've got to be producing and publishing content that is ten times better than anything currently on page one of Google for your topic. That means it's time to sacrifice quantity (churning out posts like a blogging machine) for quality (publishing only that which kicks butt and makes readers stand up, take notice, and share, recommend and link). 2. Publishers need to become 10X publishers. If you run a publishing site that accepts user-generated content, you've got to raise your standards. Accepting any article from any writer just to fill space on your pages won't cut it. 3. Build and encourage your tribe. If you are authoring truly great, useful stuff, sooner or later you will start to attract some fans. Work hard to identify those fans, to draw them into a community around your work, and to reward and encourage them any way you can. Become insanely accessible to those people. They are the ones who will begin to transmit the signals that will say to Google, "This person matters!" 4. Work as hard offline as you do online. Maybe harder. More and more as I talk with other authors who have been working hard at building their personal brands and tribes, I'm hearing that their offline activities seem to be driving tremendous benefit that flows over into online. 
I'm talking about speaking at conferences and events, being available for interviews, being prominent in your participation in the organizations and communities around your topic, and dozens of other such opportunities. BONUS: Doing all four of those recommendations will reap rewards for you in the here and now, whether or not Google ever implements any kind of "author rank." The natural power of the fact that people trust other people long before they will trust faceless brands continues, in my opinion, to be one of the least understood and underutilized methodologies in online marketing. Those who work hard to build real author authority in their topic areas will reap the rewards as Google begins to seek them out in the days to come. Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
  13. Posted by SamuelScott It's October 26, 1985. The top song in the United States is Whitney Houston's "Saving All My Love for You." The top weekend movie at the US box office is "Jagged Edge," starring Glenn Close and Jeff Bridges. And Marty McFly is about to travel back in time in "Back to the Future." (photo: Back to the Future Day) Yes, for any Mozzers who do not know, Back to the Future Day—the exact date when Marty went to the future in the sequel to the original film—is today! Also in the mid-1980s, the Internet was largely confined to the US military and large higher-education institutions. Most marketing at the time, of course, occurred via print, magazine, TV, and radio channels in what is now often called "outbound marketing." Why, then, is this relevant in 2015? One problem in digital marketing is that many digital marketers do not have much education or experience in traditional marketing and communications. If you mention the 4 Ps or ask about the promotion mix at most SEO conferences, you'll probably receive blank stares in response. It's important to know what marketers did before the Internet because many of the strategies that had been developed and honed since the early twentieth century are still applicable today. So, to help the community, I wanted to give a high-level overview of traditional marketing and communications and then provide discussion topics for the comments below, as well as actionable tasks for readers to start integrating traditional marketing principles into their digital strategies. By the end of this post, you'll have a solid sense of the following: What is the difference between marketing and communications? What is the integrated discipline "marcom"? What are the 4 Ps? What parts comprise the promotion mix (within the 4 Ps)? When should I do outbound and inbound marketing? How do the Internet and digital marketing fit into all of this? 
What about SEO, social media marketing, content marketing, growth hacking, and linkbuilding? What actionable things can I do? Why did Marty McFly's mom and dad not recognize him from the 1950s when he had grown into a teenager in the 1980s? This post is related to an earlier Moz post of mine on the marketing department of the future—I will address the connection between the two essays below. I hope you're excited! Where we're going, we don't need roads—just a little bit of time. All "Back to the Future" photos are from the official site The basic points to remember Here are two of my favorite quotes from the sidebar of the blog of advertising industry veteran Bob Hoffman: Marketing does not really change that much because people do not change—and anyone who says differently is selling something. Countless gurus, writers, and keynote speakers have proclaimed that "inbound marketing is the future," that "social media has changed everything" (see this parody of TED talks), and that "advertising is dead." But some examples from Hoffman prove otherwise: Traditional live TV viewing is not "dying" at all, and most people are not using DVRs to skip TV ads. Traditional "outbound" TV advertising is often still important enough that, in one example, Pepsi lost a lot of money and dropped to third place in market share when the company moved its entire ad spend from television ads to social media. The point is that no marketing strategy or tactic is always best for every purpose, product, brand, or industry. Sometimes TV advertising should be a part of the promotion mix; sometimes not. Sometimes content marketing is the best way to go; sometimes not. Sometimes modern "inbound" methods deliver the greatest value; sometimes it's traditional "outbound" ones. More on that below. Here, I wanted to take us back to the past to show where we have been, why it matters, and how we should incorporate it into digital marketing. 
A full list of resources is provided at the end for those who wish to learn more. The marcom workflow This flowchart is a high-level overview of the step-by-step process that I will describe in detail below. For those who want a quick summary, here is the workflow that marcom executives typically use:

1. List all of the external audiences with whom your company communicates (remember that current and potential customers are just one of your audiences)
2. Determine the specific messages that you want to communicate to each "public" based on your audience research and buyer personas
3. For customers: establish product-market fit
4. Decide on a pricing strategy
5. Choose how you will prioritize direct marketing, personal selling, advertising, sales promotion, and publicity in your promotion mix in general, and then allocate your resources accordingly
6. Determine which online and offline communications channels you will prioritize within your promotion mix, and then allocate your resources accordingly
7. Create the needed creatives, sales collateral, and marketing content
8. Transmit the marketing materials over your selected channels to your audience
9. Measure and audit the results
10. Return to steps 3–8 as necessary and adjust to attempt to maximize revenue, sales, ROI, or any other metric based on your business and marketing goals

Marketing vs. communications Traditionally, marketing and communications had been entirely different functions that each had their own departments. Marketing focused on issues such as customers, sales, and brand awareness. Communications (often called public relations or external relations) dealt with everyone else in the outside world with whom the company interacted, such as the government, community, media, and financial analysts. In other words, communications (another word for PR) as a whole is the act of communicating with any relevant external group of people. 
Obviously, companies would usually not want to say the exact same thing to customers, influencers, the media, the government, and the local community. Today, however, more and more companies are combining Marketing and Communications under a single department (often called "marketing communications" or "marcom") to become more efficient and ensure that all messages are consistent among all audiences and across all channels. The key to understand: Publish and transmit unified, integrated messaging on and across all online and offline marcom channels including websites, social networks, advertising campaigns, online content, news releases, product brochures, and sales catalogues. SEO pro tip: When relevant, include your desired keywords—using natural language rather than keyword stuffing, of course—in your messaging everywhere (for the co-occurrence benefits; websites seem to rank more highly for search terms when links to those sites appear on pages that also mention those terms!). Remember: Communication is one person speaking with another. Marketing is one type of communication. The 4 Ps in traditional marketing In a unified marcom strategy, all of a company's audiences need to be included strategically. However, because Moz's readers focus mainly on marketing rather than PR, I will focus the rest of this essay on "customer relations" (or "marketing") specifically. After creating the overall marketing messages that a company will communicate to current and potential customers, the next step in traditional marketing theory is to focus on the 4 Ps: product, price, place, and promotion. (Note: This process is not set in stone. Sometimes the 4 Ps will be determined before the overall marketing messaging is decided.) Product Not a bad product! Product strategy, according to Study.com, is essentially maximizing product-market fit—the degree to which a product or service satisfies the demand for it. 
In the modern, inbound world (and most specifically in terms of SaaS products), product strategy has been rebranded as "growth hacking." For those who are interested, Ryan Holiday goes into more detail in "Growth Hacker Marketing." The process generally follows these steps: 1. Create a Minimum Viable Product (MVP), traditionally called a prototype. 2. Get a core set of early testers/users to establish product-market fit. Show them the MVP and quiz them on what they like, what they don't like, what would be more useful, and what is not necessary. Revise the product and get more feedback; repeat this cycle so that the product is constantly improving. 3. Incorporate sharing and growth naturally in the product. Push new users to refer friends for a discount. Put social sharing buttons inside the user dashboard. 4. Experiment with the different ways to get "traction" to see what gets the most users quickly at the lowest cost. It could be advertising, organic traffic, media coverage, or any other potential marketing strategy. Once you have established product-market fit, how will you communicate the value that the product provides on all online and offline marcom channels, including websites, social outlets, advertising campaigns, online content, news releases, product brochures, and sales catalogues? Price Thankfully, this ended up not happening. Pricing strategy is not only a purview of the finance department—it is also a part of marketing. If, for example, a company wants to earn $100 in revenue, it can sell one "widget" for $100 or 100 "widgets" for $1 each. Each tactic requires a different marketing strategy. A high price communicates high value and rarity (think about the high prices of diamonds, which are essentially shiny, useless rocks marketed to rich people), while a low price shows affordability and availability (think about Walmart and its value-based marketing to the less-rich Everyman). 
In a marketing context, the pricing strategy will also need to take into account items such as the size of the market, the cost to produce the product or service, the level of competition, the economics of the target market, and whether the company wants to market itself based on quality or value (or perhaps both). Once you have decided on your pricing strategy, how will you incorporate that element into online and offline channels, including websites, social outlets, advertising campaigns, online content, news releases, product brochures, and sales catalogues? Place Distribution (as Place is now known) is the decision on how to transmit a product to customers. It might be setting up a lemonade stand on the corner, using franchisees, or mass producing widgets and then selling them to intermediaries, who then resell them. In the Internet Age, when products and services can be distributed, purchased, and consumed or used almost immediately from anywhere, this traditional part of the 4 Ps is becoming less important. Promotion One way to promote yourself is to get in the mass media. Promotion is "raising customer awareness of a product or brand, generating sales, and creating brand loyalty." Promotion is also the most complex part of the 4 Ps because of the many decisions that marketers must make about which methods are the best ways to achieve those three goals. The promotion mix has always consisted of five elements: direct marketing, sales promotion, personal selling, advertising, and publicity. Based on the product, industry, goal, and target audience, each element is given a different weight and priority both online and offline. The promotion mix For a second, forget about "SEO," "content marketing," "social media marketing," and the other terms that are frequently used among digital marketers. Here are the five elements of the traditional promotion mix described in specific detail. 
I will use specific definitions from my old MBA marketing textbook ("Principles of Marketing" by Philip T. Kotler and Gary Armstrong) because, as I will explain below, how we define our terms greatly affects our marketing success. Direct marketing Direct marketing is "direct connections with carefully targeted individual consumers to both obtain an immediate response and cultivate lasting customer relationships." Direct marketing "includes catalogs, telephone marketing, kiosks, the Internet, mobile marketing, and more." Advertising Advertising is "any paid form of non-personal presentation and promotion of ideas, goods, or services by an identified sponsor." Advertising "includes broadcast, print, Internet, outdoor, and other forms." Personal selling Personal selling is "personal presentation by the firm's salesforce for the purpose of making sales and building customer relationships." Personal selling "includes sales presentations, trade shows, and incentive programs." Sales promotion Sales promotion is "short-term incentives to encourage the purchase or sale of a product or service." Sales promotion "includes discounts, coupons, displays, and demonstrations." Publicity Publicity is "gaining public visibility or awareness for a product, service, or your company via the media." To learn more about publicity, read "The Father of Spin," a biography of the person who essentially invented the practice in the early twentieth century. As Amazon's description notes, he did "publicity campaigns for American Tobacco, Ivory Soap, United Fruit, book publishers, manufacturers of eggs and bacon, and the platforms of presidents from Coolidge to Eisenhower." (Marketing can be done for good or evil ends—I will leave the choice up to the readers.) 
For a modern example, see this Kickstarter campaign to build a miniature version of the hoverboard that was shown in "Back to the Future II": Perhaps inspired by the Kickstarter, Lexus is also releasing its own hoverboard and getting a lot of publicity as a result. How much brand awareness and how many links, social shares, and followers do you think they have gained as by-products? Creating the best promotion mix The chart above presents an overview of the uses, benefits, and drawbacks of each element of the promotion mix. Answering questions like the ones below can help to determine which elements are relevant to one's marketing goals and show companies how to allocate resources to each element appropriately:

- I need to introduce a new product to a new market (advertising)
- I have a product that's under attack by competitors' products, and I need to retain my current customer base (sales promotion)
- I have a product that is highly specialized, technical, or expensive (personal selling)
- I need to correct false impressions or counter false claims made about my product (publicity)
- I need to create greater brand awareness of my product (advertising)
- I need to communicate new features to increase consumption by present customers (direct marketing)
- I have a product with a long sales cycle (personal selling)
- I need to generate more "buzz" or word-of-mouth business (sales promotion or publicity)
- I need to build a new image and reposition my product (advertising)

Companies will generally allocate different weights to each part of the promotion mix based on how they answer these types of questions. Here is an example from the Edward Lowe Foundation, a US non-profit organization that helps to encourage local entrepreneurship: Whichever elements you choose, how will you incorporate them into your desired online and offline channels, including websites, social outlets, advertising campaigns, online content, news releases, product brochures, and sales catalogues? 
Don't get distracted by buzzwords Now, I have been discussing external communications, the 4 Ps, and the promotion mix—all of which have been used by traditional marketers since before the Internet had even existed. But why is this important to digital marketers today? In my earlier Moz essay on the integration of PR and SEO (from when I was working for an agency), I explained the traditional communications process in this manner: Here's what I had written at that time: A sender decides upon a message; the message is packaged into a piece of content; the content is transmitted via a desired channel; and the channel delivers the content to the receiver. Marketing is essentially sending a message that is packaged into a piece of content to a receiver via a channel. The rest is just details. The Internet is just a new set of communications channels over which marketing promotional mixes are executed. As Kotler and Armstrong note in my old textbook (emphasis added): As noted earlier, online marketing is the fastest-growing form of direct marketing. Widespread use of the Internet is having a dramatic impact on both buyers and the marketers who serve them. In this section, we examine how marketing strategy and practice are changing to take advantage of today’s Internet technologies. "Digital marketing" is really just doing direct marketing, sales promotion, personal selling, advertising, or publicity via a specific collection of communications channels that we call the Internet. The myth of "social media marketing" I think I see Friendster and Orkut in there. There is no such thing as "social media marketing" as a thing unto itself. (Please notice that I put that phrase as a whole in quotes.) Take the Lexus Back to the Future hoverboard's Facebook page. 
All of the posts that are gaining likes, comments, and shares are not examples of "social media marketing"; by definition, it is "publicity" (via an online channel) because it is "gaining public visibility or awareness for a product, service, or your company via the media." If I export a list of people who mention "widgets" on Twitter and then tweet to each person to sell them widgets, that is not "social media marketing." By definition, it is doing direct marketing (via an online channel) because I am establishing "direct connections with carefully-targeted individual consumers to both obtain an immediate response and cultivate lasting customer relationships." If I respond to customers who are asking questions on Facebook or Twitter about how my company's product works, then I am not doing "social media marketing." Obviously, I'm doing customer support. "Social media" is not a marketing strategy. Social media, just like the telephone, is a communications channel over which marketing, PR, customer support, and more can all be performed. After all, there's no such thing as a "telephone strategy." Why words matter As a former journalist, I take pride in being very precise with language because it is always important to communicate ideas 100% accurately, fairly, and objectively. Within the articles that I contribute to the digital marketing community (and sometimes in the comments on others), I often discuss the definitions of terms because being precise with our language is the best way to help all of us do our jobs better. If you want to integrate traditional and digital marketing—or, to be more accurate, if you want to market over digital channels—here are some examples of what to do: Stop studying "social media marketing." Start studying the best practices in "direct marketing," "customer support," or any other desired activity and then apply those ideas whenever you use social media channels. 
In the coming years, publicists, customer support representatives, and others will naturally incorporate social media into their existing functions. "Social media" is not going to be a job unto itself. Stop studying "linkbuilding" and "doing content marketing to earn links." Start studying the best practices in "publicity" because 99% of natural, quality, and authoritative links come as natural by-products of getting the media, bloggers, and people in general to talk about you online. The same principles of publicity apply regardless of whether I am using the channel of the telephone, e-mail, or social media when communicating with the media. Start studying the best practices in personal selling and sales promotion, and apply those principles whenever you do personal selling or sales promotion over digital channels. Selecting the channels for your promotion mix What channels would the makers of "Back to the Future" use to market the film in 1985 compared to 2015? After you have decided upon your communications messages, determined the 4 Ps, and selected your promotion mix, only then is it time to select your channels by answering these questions: Can our audience be best reached online or offline (or some degree of both)? Based on the answer to the first question, which channels within each category should we use? (For example, the offline channels of TV, radio, newspapers, or magazines, or the online channels of advertising networks, social media, or online communities?) The rule of thumb is to "go where the target audience lies"—whether it's online or offline or both—but it's more complicated than that. Channels themselves have their positives and negatives. Here are some examples based on executing the promotion mix online or offline: Advertising: The results and ROI of traditional offline advertisements are difficult to track precisely, but people generally pay more attention to them. 
Online advertising is easier to track, but the industry is rife with alleged fraud (a Moz essay of mine), and more and more people are using ad blockers. Direct marketing: Is it best to grow an e-mail list, to run searches on Twitter to isolate groups of people who are interested in what you offer, or to send out a sales catalogue and track who purchases products? The answer will be different for everyone based on the audience. Personal selling: If you're selling, say, diamonds or expensive enterprise software, fewer people might buy following a Pinterest campaign compared to using the telephone or even meeting someone in person. Sales promotion: Offering quick discounts needs to convey a sense of urgency, so it's important to use channels that will reach audiences immediately. Therefore, it makes more sense today to use mobile marketing over snail mail, for example. Publicity: Traditional and digital PR are increasingly converging. As I explained in a Moz post on PR 101 for digital marketers, publicity in the past focused on writing pitches, creating media lists, and pitching reporters and bloggers on a story. However, gaining attention for a company today may require using Facebook, Twitter, and other social media networks to ensure mass exposure to a creative campaign. Now, I understand that Moz's audience is focused almost entirely on digital. But digital is not always the answer. Take this question from Hoffman, the retired ad agency CEO: First, I want you to think about your refrigerator. Think about all the stuff that's in there: the cheese, the eggs, the juice, the jelly, the butter, the beer, the mayonnaise, the bacon, the mustard, the frozen chicken strips... Now think about your pantry. The cereals, the beans, the napkins, the flour, the detergent, the sugar, the rice, the bleach... Now answer these questions: Do you "share branded content" about any of this stuff? Do you feel "personally engaged" with these brands? 
Do you "join conversations" online about this crap? Do you ever "co-create" with any of these brands? Do you feel like you are part of these brands' "communities?" Now answer me this: If you don't, why in the f------ world do you believe anyone else does? The specific product and industry are important to keep in mind while selecting channels. Big consumer brands usually benefit the most from traditional channels. After all, Pepsi lost a lot of money when it moved all its ad spend from TV to social media. SaaS products, by contrast, might be completely different. Again, the key is to test to see what works. The rest of the marketing process The 4 Ps, promotion mix, buyer personas, and channel research are only the strategic first half of the marketing process. As I explained in my earlier essay on the marketing department of the future, the rest of the process consists of creating online and offline marketing collateral and content, transmitting it to the audience, and then auditing the results. Examples of creatives:

- Direct marketing: producing sales collateral to give to prospects directly
- Advertising: creating online and offline ads
- Personal selling: designing presentations and webinars
- Sales promotion: creating coupons, landing pages, and more
- Publicity: writing online and offline by-lined articles or capturing a publicity stunt on video

Transmission & audit The next step is to transmit the creatives to the audience over the selected online or offline channels. Once a full marketing campaign has been executed, it's time to audit the results. A company might find, for example, that a combination of offline advertising, online direct marketing, and both online and offline publicity works the best. Or it could be something else entirely. The only way to know is to test. A hypothetical example Here are four examples—one hypothetical and three real ones—of the different ways that traditional and digital marketing strategies can be integrated. 
One comment I left a few months ago on another Moz essay on creating demand for products was this: Moz sells, in part, "SEO software." Say SEOmoz (as it was called in the beginning) had been launched in 1995. There would be little keyword volume for "SEO software" for this reason: No one knows that "SEO" exists in the first place, so there would obviously be no demand for "SEO software" specifically. Moz would probably have had only one customer—Danny Sullivan. :) So, in such a scenario, if I were to sell SEO software in 1995, I would first do a PR and advertising campaign to generate awareness of SEO in general and SEO software specifically. I'd bet that search volumes would increase in due time. Then, once people know that both things exist, you can start to capture prospects and move them down the funnel via inbound marketing. "Outbound marketing" and "inbound marketing" will always both be needed, because outbound marketing creates demand while inbound marketing fulfills demand. In my opinion, studies that purport to show that inbound marketing is always better than outbound marketing tell only part of the story. If traditional, offline advertising is so ineffective, then why do I still see thousands of ads everywhere I go every day? Three real-world examples An American pizza restaurant As I once explained in a BrightonSEO talk (see a summary with slides on my website), a small pizza parlor in Philadelphia got the best results—a lot of brand awareness, hundreds of high-quality links, and thousands of Facebook "likes"—by thinking not about "SEO" or "social media marketing" or "content marketing," but rather about good, old-fashioned "publicity" within the promotion mix. The local business—whether intentionally or not—got a lot of local news coverage by allowing people to "pay it forward" by purchasing slices to give to people in need. The news coverage snowballed into national coverage in the United States and led to the business owner appearing on the talk show "Ellen." 
The pizzeria did not produce one piece of "content," or even have a blog at all. For most small, local businesses (especially restaurants), I'd invest in a good, creative publicist over "content marketing" any day. Still, it's crucial that one's marketing toolkit contain every potential strategy and tactic because different promotion mixes work best for different companies and industries. Pizza Hut Israel Yes, I've got a soft spot for pizza. While I was writing this post here in Tel Aviv, I received this e-mail and saw this Facebook post: (That's Pizza Hut Israel selling a pizza with a crust made of bite-sized pieces filled with cream cheese.) Now, what is this? It's not "e-mail marketing" or "social media marketing." Pizza Hut here had decided to do a sales promotion via the channels of e-mail and Facebook. It was the company deciding upon a certain promotion mix (likely for the reasons described above) and then choosing to execute that promotion over the channels of e-mail and social media. Logz.io Logz.io cofounders CEO Tomer Levy (left) and VP Product Asaf Yigal (right) at AWS re:Invent 2015 In another example, we at Logz.io offer predictive error detection in our ELK-as-a-service cloud platform for DevOps engineers, and we have found that the best results for us come from a combination of personal selling and publicity. The personal selling is when we sponsor and speak at conferences for DevOps engineers and system administrators; the publicity is when we publish and then publicize informational articles and guides in major publications and on our website. (For example, our CEO, Tomer Levy, wrote about different open-source DevOps tools on DevOps.com, we've published a guide on how to deploy the ELK Stack, and I discussed how to use server log analysis for technical SEO here on Moz.) Again, every company and industry is different. It's important to test every possible promotion mix to see what delivers the greatest ROI for you. 
Within the marketing industry, there are many self-interested parties that advise one promotion mix or another. The best thing to do is to test everything and see for yourself. I hope these three examples will help to get you thinking. Now, where's SEO? I've discussed my opinions at length in the other essays of mine to which I've linked here, so I will just summarize them rather than repeat myself: "SEO" is now technical and on-page optimization. The more that one understands traditional marketing, the more that one sees that (good) "off-page SEO" tactics are really just good traditional marketing. Almost any "off-page SEO" tactic that anyone can name is simply direct marketing, sales promotion, personal selling, advertising, or publicity by another name. The reason it is important to understand this concept is that, as Google's algorithm becomes smarter and thinks more and more like a human being, it's becoming imperative to think more and more about building a brand among people over the long term rather than chasing an algorithm and directly trying to get high rankings in the short term. To help the community, my goal here is for SEOs to stop thinking so much about SEO specifically and to think more about marketing. After doing all of the needed technical and on-page SEO, the best results—higher rankings, more backlinks, and more engagement—will come simply as by-products of building real brands that have a lot of authority and engagement. Today, this is what will happen if you try to manipulate, outsmart, or otherwise chase Google's algorithm: Summary If you've read this far, I'm glad that you found my thoughts to be interesting! I'll leave you with this final idea: Complete List of Resources: Principles of Marketing by Philip T. Kotler and Gary Armstrong (textbook) (pro tip: buy a prior edition for a lot cheaper than the new edition) Public Relations: Strategies and Tactics by Dennis L. Wilcox and Glen T. 
Cameron (textbook) (ditto) The 4 Ps of the marketing mix—Product, Price, Place (now called Distribution), and Promotion "Growth Hacker Marketing" (the integration of product and marketing) by Ryan Holiday The promotion mix—advertising, personal selling, sales promotion, publicity, and direct marketing Creating a promotion mix: The Edward Lowe Foundation and the Chartered Institute of Marketing of the U.K. (PDF file) Publicity: See my prior Moz essays on An Introduction to PR and The Coming Integration of PR & SEO as well as my Mozinar on the latter topic. In addition, read "The Father of Spin" The Marketing Department of the Future (another Moz essay of mine)
  14. Posted by EricaMcGillivray I love working with all kinds of speakers at Moz, whether for big shows like MozCon or our biweekly webinars, Mozinars. I also get out there and speak myself. Many people ask me how to become a speaker in our industry or if they can speak at one of Moz's events. The truth is, speaking is hard. And putting yourself out there is awesome. So, you're ready to take a step toward being onstage. What should you be doing?
Have a speaking goal
A speaking goal will keep you focused on what you want to get out of speaking. Goals may vary event to event, or encompass both short-term and long-term dreams. And yes, they may change over time. Here are some goals either I or speakers I've worked with have had: Conquer my fear of public speaking. Share my incredible new idea with a crowd of like-minded people. Share my field with an adjacent audience. Show my expertise in a field. Get new clients or a new job. Get other speaking gigs based on how well I do. Speak on the MozCon stage. Learn how to deliver a dynamic presentation the way speakers like Rand Fishkin and Wil Reynolds do. Speak in front of a crowd of more than 1,000 people. What's your goal?
Come up with pitches!
Keep a document of ideas that you'd like to speak (or write) about. Don't wait until you see the announcement that a conference is now accepting pitches or until you receive outreach about speaking, as you'll probably suffer from idea block. Research the conference you want to speak at. Figure out who its audience is. Look at past topics. If possible, attend the conference before you toss your hat in to fully understand what it's like. Make sure you're the right speaker. Some conferences have requirements, such as being a sponsor, having a certain title (VP, Director, CEO), making sure speakers fit a code of conduct, preferring actionable talks to inspirational ones, etc. Great pitches clearly communicate your topic to the people throwing the conference.
Sadly, many pitches come in as a teaser written for an audience to get them to attend your session. A conference runner and selection committee need to know the actual meat of your presentation. They want to make sure the topic's details are ethical, match their audience needs, meet knowledge level requirements, and more. Think of how different a link building session at Blackhat World would be compared to SMX. Stay informed on when pitches go live for the conference. This means you can be prepared to submit your idea immediately—not worrying about the deadline—and will ensure you don't miss it. For example, the pitches for MozCon community speakers always go live three months prior to the conference date. For the upcoming MozCon in 2016, they'll be on our blog in June.
Build your speaking portfolio
I can't stress the importance of having a speaking portfolio enough, especially if you're interested in talking at conferences with closed selection committees, such as MozCon. Speaking portfolios show off your hard work and put actual, concrete examples in the hands of event organizers. They'll also set you apart from others. A search for "marketing speaker" on LinkedIn gives 43,000 results: For "SEO speaker" on Google, 8.9 million results are returned, and Scott Wilson dominates the knowledge box: What should you put into your portfolio?
1. A decent, professional headshot
For any conference you're speaking at, you'll need to send in a headshot. You'll want to make sure yours looks good both on your portfolio and in comparison to your fellow speakers. Be prepared when you're selected as a speaker. Don't be the one who sends in a headshot taken at a party with someone else obviously cut out of it or from when you last renewed your passport. There are plenty of professional photographers who will take headshots for you. Make sure you get both the rights to use them and the high-resolution version.
If you can't afford one, check out Kick Point's guide to taking a professional headshot with your phone. While you want to present your best face, make sure the photos actually look like you. It's okay to photoshop a pimple, but own and love your wrinkles, big ears, or whatever else you're worried about. Some speakers also might add a memorable touch to their photos, which they then bring to the stage. For instance, Ruth Burr Reedy's headshot features a green blazer that she often wears onstage when speaking. You will probably want to get a new headshot at least every other year. The reality is that we age, changing both our personal styles and our looks. The worst comment I've ever received was someone asking me the age of my headshot because I "looked so much younger" in it. It was only two years old, but I'd changed my hairstyle, which made me look older. (Also, when remarking on someone's headshot, don't make sexist comments like these.)
2. Have a speaking bio ready
Another general request from conferences will be to send in a bio about yourself. You want to keep it short and relevant for the audience you're speaking in front of. No one wants to read a bio that's longer than your topic pitch. Here are some examples of my own bios: Longer, for a broader audience: Erica McGillivray is a die-hard geek who spends a ridiculous amount of time being nerdy, both professionally and personally. At Moz, she's the senior community manager and wrangles a community of over 500,000 members, co-runs the annual MozCon, and works on whatever else is thrown her way. She's also a founder of GeekGirlCon, a nonprofit run by volunteers that celebrates and supports geeky women with events and conventions. In her spare time, Erica's a published author and has a comic book collection that's an earthquake hazard. Shorter, with a marketing focus: Erica McGillivray spends a ridiculous amount of time being geeky, both professionally and personally.
At Moz, she's the senior community manager, wrangling 500,000+ people and co-running their annual conference MozCon. Erica is also a founder of GeekGirlCon, a published author, and has a comic book collection that's an earthquake hazard. Follow her at @emcgillivray. Shorter, with a pop-culture focus: Erica McGillivray spends a ridiculous amount of time being nerdy, both professionally and personally. She's a senior community manager and wrangles over 500,000 community members for a local startup. Erica's also a founder of GeekGirlCon, a published author, and has a comic book collection that's an earthquake hazard.
3. Share your slide decks
SlideShare makes sharing your decks 100% super easy. While some conferences will share your decks, you don't want to make your decks hard to track down. You want results like Rand's when your name is Googled with the words "slide deck": If you don't want to use SlideShare, there are other services out there. Or you can just upload it to your own site. Make sure you use a PDF version of your slide deck for the upload on whatever service you use; otherwise, your typography will look terrible on other people's computers that don't have those fonts installed. Example slide decks show off how well you can build knowledge into a deck. They show your style, and they can also show how you've grown as a speaker. You can say that you always present "actionable tips," but a deck speaks to what you really do. What if all your decks are proprietary or unshareable to the public? It's time for you to create a deck for your portfolio. Maybe later you'll present it at a conference. Or maybe it's just a piece telling the world that you can indeed create a great deck. What if all your decks are more interesting when presented? I definitely subscribe to having fewer words on the screen, which can mean that presentations become almost meaningless without the audio.
Ian Lurie does a great job at adding text—in an obvious way—to slides that make no sense without his voice: This is extra work, but it can really boost you as an expert. Not to mention that your audience will love you for giving them access to your deck later.
4. Get a recording of you presenting
Nothing says more about your qualification as a speaker than a recording of you presenting. It shows off your style, your confidence, and your radness. However, there can be lots of challenges around getting a recording. Many conferences in our space that have great speakers, like Pubcon, SMX, and State of Search, record few or none of their sessions. And other conferences, like MozCon and SearchLove, do record sessions, but sell the videos, so they stay private. How do you get a recording? A. Do it yourself. Record one of the presentations you've already planned on giving (or maybe that sample slide deck you built). Even if it's just you and the camera, it's better than nothing. One of my own speaking recordings is me practicing a talk in front of a handful of coworkers. B. Ask if your recording can be shared privately. In the case of MozCon, speakers have asked and then used their videos to privately show conference runners their work. This is a great option when you're pitching, but isn't ideal when you're setting up a page to show off your good work.
5. Put it all together on a webpage
Since most of us haven't done so much speaking that our decks and videos are easily found via Google, like Rand Fishkin's, putting all your information on one page is paramount. Plus, it makes all your assets easy to link to. Chris Brogan uses his LinkedIn page (plus how to contact him): Erika Napoletano's site makes it easy for you to understand her style and requirements for a speaking gig: Kerry Bodine's site displays her videos and tells you which events she's spoken at and will speak at:
Ask the conference organizer questions
Once you're in, you want to be prepared for show day.
Unfortunately, a lot of conferences don't give you all the information. Here are five standard questions I ask conference runners when I'm speaking in order to be fully prepared, although you may have other needs: 1. What are the show's hours? What time is my speaking slot? Are there any special events for speakers to attend (parties/networking, speaker-only gatherings, etc.)? You'll want to know this information as you book your travel. You'll want to make sure you're on time for your talk, not completely jetlagged (if crossing time zones), and build in opportunities to meet your speaking goals. 2. How many people are attending this conference? Can you share some demographics about who your audience is? You want to be prepared for both the audience size and their specialty. If you're in front of a group of 20, you can easily do interactive elements in a way you cannot in a room of 1,000+ people. Likewise, you want to tailor your talk to the audience with examples and knowledge levels that they'll relate to. When Dr. Pete Meyers spoke at SMX Sydney, he Australiafied his slide deck: 3. Do you use fullscreen 4:3 format or widescreen 16:9 for presentations? No one wants to create an entire slide deck and then find out it's in the other format. Let me tell you from experience, changing the formatting in PowerPoint or Google Slides stretches or squashes your images in horrifying ways. Also, as a speaker, you should know the differences between these formats and how to properly set up your slide deck software for each one. 4. What sort of setup does the stage have? Podium or no podium? Wireless mic with a battery pack, handheld wireless mic, or wired standing mic? Will the projection happen from my own computer or your A/V system? All these questions ensure that you're prepared with the right equipment and that you dress appropriately. You don't want to run to the Apple Store at the last second when there are no proper cables for your Mac laptop.
And if you're wearing a wireless microphone, you want to make sure there's a place to put the powerpack—like a pocket or belt—which, if you're wearing a dress, may not be part of your outfit. 5. Is there a due date for slide decks? Getting your deck and anything else you've agreed to provide a conference with on time is paramount for your own time management and making the conference runners happy. Conference organizers prefer smooth working processes, as there's already enough that can go wrong with live events. The last conference I spoke at, the deck was due during my summer vacation. I made sure my deck was done ahead of my time off because I didn't want to spend my holiday creating it.
Practice, practice, practice
Nothing makes your presentation better than practicing the talk. Try to practice in front of people so you know if your jokes land or when you need to pause to let points resonate. (My cat never laughs at my jokes!) By the time you've gone through your talk five to ten times, you'll have it down much better. It will be more natural, hopefully without the stumbles and other pitfalls that occur with not having your points down. We all have parts we're great at and others we're not. That's okay, but let's work on them.
Have fun!
Never forget to have fun when you're preparing to speak. Whether you're deciding to pitch your first event or you're a seasoned speaker, you can rock it! View the full article
  15. When creating new tasks (threads) with the TPL, we can pass the AttachedToParent or DenyChildAttach options. They determine whether a task should be attached to its parent or not. In today's post I will try to explain how they differ. These options define a task's relationship with its parent task. If task A creates another task B, then with the above values we can specify […] View the full article
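For readers outside of .NET, the semantic difference can be sketched with plain threads. This is only an analogy, not the TPL API: Python threads have no built-in parent/child attachment, so the "attached" behavior is simulated with an explicit join.

```python
import threading
import time

# Rough Python analogy for TPL's AttachedToParent: an attached parent
# task does not count as finished until its child task has finished.
log = []

def child_work():
    time.sleep(0.05)          # simulate some work in the child task
    log.append("child done")

def parent_work(attached):
    child = threading.Thread(target=child_work)
    child.start()
    if attached:
        # AttachedToParent-style: wait for the child before completing.
        child.join()
    # DenyChildAttach-style would skip the join: the parent completes
    # immediately and the child keeps running as an independent task.
    log.append("parent done")

parent_work(attached=True)
print(log)  # ['child done', 'parent done']
```

With `attached=False`, the two log entries could appear in either order, which is exactly the detached behavior DenyChildAttach enforces.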
  16. Posted by EricEnge When you first start in content marketing, you usually have little to no audience of your own for your content. If you're a major brand, you may be able to develop this quickly, but it's still extremely helpful to get visibility on third-party sites to grow your reputation and visibility as a producer of fantastic content, and to also net links to your site. This can come in the form of third parties linking to content on your site, or getting guest posting or columnist opportunities on those sites. A key stage in that process is creating a pitch to the site in question, in order to get them to say "yes" to whatever it is you're requesting. The hardest part of writing any pitch isn't the creation of the pitch itself. It's the legwork you have to do in advance. Successful pitches are all about preparation, and frankly, there needs to be a lot of it. To illustrate this, I'm going to walk through the process using a fictitious landscaping business, describing what they might need to do to start successful pitching of the content they plan to create.
Step 1: Competitive research/identify topic areas
You aren't ready to pitch until you understand what else is out there. You need to visit major sites and see what they're writing about landscaping and related topics. You also need to see what your competitors are doing in terms of content marketing. If your competitor has been proactively doing content marketing for two years, it's a good idea to see what areas they've been focusing on. For example, if the competitor has already established themselves as the thought leader in Do-It-Yourself (DIY) landscaping, perhaps your initial focus should be on something else. Perhaps you can concentrate on specialty areas, such as prepping your yard for a wedding reception, a graduation party, or the integration of an in-ground pool into the yard.
I'd start by pulling raw data from tools such as Open Site Explorer, and getting the Domain Authority data on the links they have. I did this for one landscaping business, and here's a snapshot of the highest-authority links they have: For this company, it would be interesting to see what they're doing with ThisOldHouse.com. That looks like a key relationship for them, as they've received 131 links from that site. Ultimately, what you would do next is dig into the details of each of these sites, find out why the competitor got the links, and uncover what it tells you about your opportunities.
Step 2: Identify target sites
Who covers topic areas similar to yours? Have they published third-party contributions before? You can obtain some of this from the competitive research you went through in Step 1. But, to take it further, I did a search on "landscaping ideas": This brought up a bunch of high-authority sites to check out. As a next step, I collected data on their Domain Authorities, and then dug into whether or not they accepted guest posts. The search query I used to get information on whether a site accepts guest posts looks something like this: After doing that, we can assemble the data into a table that looks like this: This now helps you understand who to potentially pitch. Important note: Don't limit yourself to guest posts. With very high-authority sites like many of these, you may want to explore becoming a columnist. Pitching a column may be even easier than pitching a guest post, as it suggests that you are interested in a long-term relationship, which may be of greater interest to the target site. In addition, explore whether or not the sites in question do interviews of experts on different topics. This could be another way to get your foot in the door.
Step 3: Line up your experts
Having a legitimate expert writing for you is a crucial part of any pitch.
Successful off-site content marketing requires you to get placement on some of the top sites covering your market. You won't succeed at this unless you have someone creating content for you that really knows their stuff. It's great if the subject matter expert (SME) is you, or someone working for you. This makes pitching your expertise easier. However, if no one inside your business has the time, you can rent (contract) the expertise. Either way, make sure your author is a legit SME. If you need to rent your SME, there are many ways to go about identifying someone. Here are some potential approaches to use: Search the sites you identified that accept guest posts, and find out who is writing them. A query such as: "guest post" site:bhg.com is pretty effective for this. Search related hashtags on Twitter, such as #landscaping and #gardening, to see who's sharing related content. Try other Google search queries, such as "landscaping design articles" or "landscaping books," and identify the authors. Search Amazon directly for landscaping and gardening books. You get the idea. Once you have identified a bunch of people, you have to start figuring out who might be a potential author for you. Keep in mind that you'll need to pay them to write on your behalf, and you'll have to help them line up places to write, as well. You don't need the absolute top name in the market, but you want someone who can credibly write unique and valuable content for you (you want what Rand calls 10x Content).
Step 4: Identify the target topic
Once you have your writing team identified, work out with them what types of content they can help you create that meets these three goals: Fits your competitive strategy per Step 1. Might be of interest to your target sites. Matches up with what your SME can write. The topics you pitch need to be different for each site. Let's say we've decided on BHG.com as one of our sites of interest.
As a first step, you can try searching the query "site:bhg.com landscaping" (quotes not required): This does not yet solve the problem for us, as it shows over 6,000 results. The good news is that this site covers the topic a lot; however, you're looking to see what gaps there may be in their coverage, and then see if you can pick something that will be supplemental to what they already have published. Since 6,000+ posts is a lot to look at, let's see if we can simplify it a bit more. Here's a follow-up search: This idea assumes you're able to create content around the topic of landscaping for colonial homes. Assuming you are, you can go through this and start trying to figure out what type of content you can create that the site hasn't seen before. This is an essential part of the process. Your goal is to come up with a topic that comes across to the editor you pitch as offering unique value to their site. This is what the first four steps have been about. Don't go past this step until you have the first four steps nailed.
Step 5: Research the people you will pitch
We're getting close to pitch time, but we have one more research step left. First, figure out who it is at the target site that you are going to pitch. Usually, identifying the editorial staff is pretty simple. In the case of this CountryLiving.com site, they have an About page, which shows us who their editors are: Next, start researching the various editors. Do they publish on the site? Read what they've written. Are they active on social media? Start following them there. Extra points for establishing credibility by having meaningful interactions with them about their articles in their social feeds before ever sending them a pitch. At a minimum, make sure you learn what you can about their likes and dislikes.
Step 6: Craft the pitch
Finally, we get to write our pitch! Steps 1 through 5 are about making this step the easiest of them all. Let's start with three rules: Personalize every pitch.
No automatic pitch-building whatsoever. Know what your key value proposition is, and lead with it. Keep it short. Get right to the point, and don't waste their time. Those are the three most important things to remember. To satisfy rule two, start figuring out what the lead of your pitch is. Brought in a well-known expert? Lead with that. Groundbreaking study? Lead with that. Filling a void in the content published to date on the target site? Lead with that. This is where your pitch is won or lost. The major pitch elements are: Your lead value proposition up front. Something that shows you've done your homework. The specific nature of the request. Additional required background. Here's an example of a pitch: Even though I've included some areas that need filling in, don't confuse this with being a template that you auto-populate. The comments you make on what they've already published and the nature of what you're suggesting to them are all custom. Also, if your author or your business is really well-known, then that might be the lead value proposition, rather than the content. In that case, lead with those facts, cover the proposed article topic in the second paragraph, and structure the email differently.
Summary
As I noted in the beginning, successful pitches are all about the preparation. Treat each opportunity to pitch someone as special and rare. After all, if you send them a crappy pitch, and it shows you didn't put in any special effort, you may have burned that bridge permanently. That can be very costly, especially as your reputation and visibility continue to grow over time. Do all the upfront work correctly, and the effectiveness of your content marketing efforts will be greatly amplified. We all like to get an edge on our competition, and one of the best ways to do that in content marketing is to master and perfect your pitching process.
View the full article
  17. The second (chronologically the first) of the planned meetups with readers will take place in Warsaw on November 7, 2015 (a Saturday). It will start at noon and last from start to finish ;> (we'll have the room until ~16:00). This time registration is unfortunately mandatory, owing to the (wall-bounded) size of the room (FCFS). The constraints are unfortunate, but I'm nevertheless very curious about this meetup: it will be the first of my three planned meetings with readers (two days later there will be one in Kraków, and a little over a week later one in Wrocław, though that last one still hangs between "plans" and "execution"). So come along :) Author meetup: Organizer: PWN (thanks!) Registration: click (RSVP, FCFS) When: November 7, 12:00-16:00 (the end time is purely indicative) Where: Warsaw, ... (we'll announce the exact venue soon) Agenda (tentative): ■ Author's talk (mine) about the book and about programming - 45 min ■ Tea/coffee and snack break - 15-30 min ■ Q&A session - until the end See you in Warsaw! P.S. I've uploaded a short podcast about the book to YT, including a few words about what can be found in each chapter. The next planned podcast will be technical - yay! View the full article
  18. BlockingCollection is a special data collection designed for implementing the producer-consumer pattern. With BlockingCollection, the effort needed to implement this pattern is minimal. We don't have to worry about synchronization, critical sections, or deadlocks. Let's start right away with an example. The producer will look as follows: As we can see, the producer implementation is nothing more than adding data to the collection. The Add method is thread-safe […] View the full article
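The producer-consumer pattern the teaser describes can be sketched outside of .NET as well. Here is a rough equivalent using Python's thread-safe queue.Queue in place of BlockingCollection; the sentinel value plays the role that completing the collection does in the original (this is an illustrative stand-in, not the C# API).

```python
import queue
import threading

q = queue.Queue(maxsize=10)   # bounded, like a BlockingCollection with a capacity
SENTINEL = None               # marks the end of production

def producer():
    for item in range(5):
        q.put(item)           # blocks if the queue is full; no manual locking
    q.put(SENTINEL)           # tell the consumer that no more items are coming

def consumer(results):
    while True:
        item = q.get()        # blocks until an item is available
        if item is SENTINEL:
            break
        results.append(item)

results = []
threads = [threading.Thread(target=producer),
           threading.Thread(target=consumer, args=(results,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [0, 1, 2, 3, 4]
```

As in the C# version, all the synchronization lives inside the queue: the producer just adds items, and the consumer just takes them.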
  19. Posted by randfish It's a fact of life: we get better at what we do with time. Do you use that to your advantage when it comes to your site's content? Whether you're riding the wave of a successful post or improving what you've done before, republishing is something that should be on your mind and your to-do list. And what's more, Google will actually reward you for doing it! In this Whiteboard Friday, Rand explores the how and why of republishing, helping you set goals for yourself and your content. Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about republishing and why Google rewards republishing so much. I think this is actually an underutilized tactic in SEO and in content creation. We have this idea that we make our content checklist, and we say, "All right, these are the topics I need to cover. These are the keywords I'm targeting, and this is the audience I'm trying to reach." Then once you create that piece of content, you kind of go, "All right, let's see how it does." Then, however it does, you kind of go, "Okay, let's try and make the next piece better. We already took our shot at that piece of content." But that does not have to be how it is. Google actually rewards republishing. So do, by the way, audiences. If a piece of content is a hit or if you're sure that a piece of content could be a hit, or that an audience would appreciate and enjoy it, I guarantee they're going to appreciate and enjoy it if you update that piece of content or produce something better on that topic. Same thing is true of social media. You can see a lot of the big content sites, particularly those that are very, very successful these days, your BuzzFeeds and that kind of thing, doing pieces of content over and over and over again. Basically, finding a formula, hitting it, and updating that content once they do it.
So, let me talk about why this happens. I'm using the example of guinea pig food, as opposed to guinea pigs for food. You could do either way. Here I've registered the domain name -- I haven't actually -- RandsFurryFriends.com. I've got my guinea pig food content that I put in my guinea pig section on their diet. That was produced in 2010. But five years later, I'm going, "Man, that content is getting old. It's not performing the way I want it to. I'm going to publish a new piece of content targeting those same keywords, on the blog this time of RandsFurryFriends.com, and that's going to go out October 15th or whenever that's coming out." This works really well because Google does a few things here.
Why It Works
First off, they're often testing. They're verifying: When a piece of content comes out, did it do well? Google might place that piece of content on the first page of the results and then see how it performs with a small subset of searchers. That could be personalized, or that subset could be determined a bunch of different ways. But if it performs well, if it's the case that we really liked how engagement looked on this SERP, a lot of people were clicking on that link, they weren't clicking the Back button, they seemed to be happy with the results, then Google's going to say, "Hey, maybe that page deserves to stay here long-term." If you never republish, you don't know whether the problem was that you didn't earn the engagement and the user happiness and searcher happiness that Google needed in order to keep you on that front page. Maybe you had all the other ranking signals you would have needed, but you just didn't get there with searcher engagement. Fresh publishing often provides its own rankings boost. You can see this generally speaking.
So Russ Jones from Moz has done an analysis of SERPscape and seen that in queries where Google is showing dates on multiple results in the SERPs, there tends to be a high correlation with positive rankings performance and showing a recent date. So we know that Google really likes fresh content for certain kinds of search queries, and it's almost certainly the case that even for those where it's not providing a massive boost, it's providing some value. Being the most recent on a topic is probably going to give you some value and benefit. So that's another one that helps us here. When you publish multiple times, you're building up that topical authority, that topical association that Google has with your site. So they might say, "Huh, Rand's Furry Friends offers a lot of content, but he publishes quite often and quite in-depth about guinea pigs in particular, and so maybe we should start associating Rand's Furry Friends with guinea pigs and show him for more and more guinea pig-types of search queries." That can broaden the reach of any given particular piece of content to the keyword universe that you can potentially rank for, which again, awesome. Really nice to have that. Multiple pieces of content tend to yield multiple opportunities to earn links, earn amplification, earn those social shares, earn engagement, and earn ranking signals of all kinds. So when I produce this, I've got another shot at reaching my audience and getting all the signals, all the links, and all the stuff that I need to rank well if I didn't do it the first time. Or I can do it additionally. Over time, you or your content team, you're going to get better at this. Five years ago, I guarantee, the content that I created for Whiteboard Friday, which you're watching right now, for our blog, it was not nearly as polished, as high quality as what you're getting to experience today on the Moz blog. We've gotten better at this stuff. 
Even our hits from 2010 are not as good as some of our good content in 2015 or 2014, because we're improving. This is going to be true for you as well.
Potential Processes
There are three different ways, potential processes that you can go about when you're doing the republishing thing. These shouldn't all be done together. You should choose the one that makes the most sense for you and your situation. So first off, (A) multiple pieces that are published one after another—that time frame could be anything between them—targeting slight keyword variations and slight content variations. So right here I've got the 10 foods your guinea pig will love and guinea pig food, just the broad article. I might actually link to each of these between the two of them. This one, it's a little more listicle-kind of format. This one's a little more informational, knowledge-based. The idea, hopefully, what I really want to do is get one of these ranking in the top two or three results. Then once I produce the other one, if it ranks on page one, we know how Google treats that. They'll put it directly below. So they won't have you rank number two and you rank number eight. No, you'll rank number two, and if you rank number eight, boom, they'll bump you up to rank number three. So now I dominate two and three in the top three results. That's going to boost my click-through rate. That's going to give me a ton of opportunity to earn those searchers. Just awesome. That's the dominate search results approach. (B) is replacing old content with new. So essentially, I've produced XA, and I'm going to replace it with XB. So I might say, "This page, I'm putting this content on there. The URL is going to stay the same." The idea being I'm updating and improving that content. I have a second chance at earning links, earning amplification signals, and hopefully getting better engagement. Maybe if I'm already ranking well, I can improve that. I do this a lot with Moz blog posts.
If I get an email from someone and I'm referencing an old post, and I notice that old post is just a little messy or not exactly what I'd offer today, I'll go in and update it. Sometimes that only takes me 10 or 15 minutes. Sometimes it takes me an hour or two. But then I can broadcast it again. I can tweet it. I can put it on LinkedIn. I put it on Google+. I put it on Facebook. I share it around. That broadcast activity often earns lots of new links pointing to it, and I see that pretty consistently, at least with my audience. (C) I can redirect old content to new. So potentially I can say, "Hey, you know what? I'm producing this new piece on 10 neat foods your guinea pig will love. This old article I just don't love anymore, but I want to get the rankings benefit and all the signals to this new page that this old one has." So all these links and wonderful things that were coming in here, I want to redirect them, and so I'm going to use a 301 to point A over to B. This has worked for us many, many times with big content pieces that we've produced here at Moz, everything from the Ranking Factors to our industry survey to lots and lots of other things. We'll even do this when we produce a new blog post that is really replacing an old one. We'll go ahead and 301 redirect, or potentially rel=canonical, that old one, to make sure the old one is still accessible for someone if they want to see the historical version, but send all the ranking signals, all the links, and all the traffic to the new one. Like I said, with these three, you should choose which goal you're trying to solve and then pick the republishing process that works best for you. All right everyone. Look forward to seeing you in the comments and to seeing you again next week for another edition of Whiteboard Friday. Take care.
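As a side note on process (C): the redirect mapping can be sketched as a tiny lookup. The guinea pig URLs below are invented for illustration, and a real site would usually configure this in the web server (an Apache `Redirect 301` rule or an nginx `return 301` block) rather than in application code:

```javascript
// Minimal sketch of republishing process (C): permanently redirecting
// an old article's URL to its replacement so that links, traffic, and
// ranking signals consolidate on the new page.
const redirects = {
  '/guinea-pig-food': '/10-neat-foods-your-guinea-pig-will-love',
};

// Returns a 301 response descriptor for a retired path,
// or null when the path should be served normally.
function resolveRedirect(path) {
  const target = redirects[path];
  return target ? { status: 301, location: target } : null;
}

console.log(resolveRedirect('/guinea-pig-food'));
// { status: 301, location: '/10-neat-foods-your-guinea-pig-will-love' }
```

The key point is that the old URL answers with a permanent redirect rather than a 404, so anyone (or any crawler) following an old link lands on the new piece.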
Video transcription by Speechpad.com. Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
  20. Posted by MarieHaynes Thin content! Duplicate content! Everyone knows that these are huge Panda factors. But are they really? In this article, I will explore the possibility that Panda is about so much more than thin and duplicate content. I don’t have a list of ten steps to follow to cure your Panda problems. But I do hope that this article provokes some good discussion on how to improve our websites in the eyes of Google’s Panda algorithm.

The duplicate content monster

Recently, Google employee John Mueller ran a webmaster help hangout that focused on duplicate content issues. It was one of the best hangouts I have seen in a while, full of excellent information. John commented that almost every website has some sort of duplicate content. Some duplicate content could be there because of a CMS that sets up multiple tag pages. Another example would be an eCommerce store that carries several sizes of a product and has a unique URL for each size. He also said that when Google detects duplicate content, it generally does not do much harm; rather, Google determines which page they think is the best and displays that page. But wait! Isn’t duplicate content a Panda issue? This is a widely held belief in the SEO world. In fact, the Moz Q&A has almost 1800 pages indexed that ask about duplicate content and Panda! I asked John Mueller whether duplicate content issues could be Panda issues. I wondered if perhaps duplicate content reduced crawl efficiency and this, in turn, would be a signal of low quality in the eyes of the Panda algorithm. He responded saying that these were not related, but were in fact two separate issues. The purpose of this post is not to instruct you on how to deal with duplicate content. Google has some good guidelines here. Cleaning up your duplicate content can, in many cases, improve your crawl efficiency, which in some cases can result in an improvement in rankings.
But I think that, contrary to what many of us have believed, duplicate content is NOT a huge component of the Panda algorithm. Where duplicate content can get you in trouble is if you are purposely duplicating content in a spammy way in order to manipulate Google. For example, if a huge portion of your site consisted of articles duplicated from other sources, or if you are purposely trying to duplicate content with the intent of manipulating Google, then this can get you a manual penalty and can cause your site to be removed from the Google index. These cases are not common, though. Google isn't talking about penalizing sites that have duplicate product pages or a boatload of WordPress tag pages. While it's always good to have as clean a site as possible, I'm going to make a bold statement here and say that this type of issue likely is not important when it comes to Panda.

What about thin content?

This is where things can get a little bit tricky. Recently, Google employee Gary Illyes caused a stir when he stated that Google doesn’t recommend removing thin content but rather beefing up your site to make it “thick” and full of value. Jen Slegg from The SEM Post had a great writeup covering this discussion; if you’re interested in reading more, I wrote a long post discussing why I believe that we should indeed remove thin content when trying to recover from a Panda hit, along with a case study showing a site that made a nice Panda recovery after removing thin content. The current general consensus amongst SEOs who work with Panda-hit sites is that thin content should be improved upon wherever possible. But if a site has a good deal of thin, unhelpful pages, it does make sense to remove those pages from Google’s index. The reason for this is that Panda is all about quality. In the example I wrote about where a site recovered from Panda after removing thin content, the site had hosted thousands of forum posts that contained unanswered questions.
A user landing on one of these questions would not have found the page helpful and would likely have found another site to read in order to answer their query. I believe that thin content can indeed be a Panda factor if that content consistently disappoints searchers who land on that page. If you have enough pages like this on your site, then yes, by all means, clean it up.

Panda is about so much MORE than duplicate and thin content

While some sites can recover from Panda after clearing out pages and pages of thin content, for most Panda-hit sites, the issues are much deeper and more complex. If you have a mediocre site that contains thousands of thin pages, removing those thin pages will not make the site excellent. I believe Panda is entirely about excellence. At Pubcon in Vegas, Rand Fishkin gave an excellent keynote speech in which he talked about living in a two-algo world. Rand spoke about the “regular algorithm,” which, in years past, we've worked hard to figure out and conquer by optimizing our title tags, improving our page speed, and gaining good links. But then he also spoke of a machine learning algorithm. When Rand said “We’re talking about algorithms that build algorithms,” something clicked in my head and I realized that this very well could be what's happening with Panda. Google has consistently said that Panda is about showing users the highest-quality sites. Rand suggested that machine learning algos may classify a site as a high-quality one if they're able to do some of the following things:

- Consistently garner a higher click-through rate than their competitors.
- Get users to engage more with your site than others in your space.
- Answer more questions than other sites.
- Earn more shares and clicks that result in loyal users.
- Be the site that ultimately fulfills the searcher's task.

There are no quick ways to fulfill these criteria. Your site ultimately has to be the best in order for Google to consider it the best.
I believe that Google is getting better and better at determining which sites are the most helpful ones to show users. If your site has been negatively affected by Panda, it may not be because you have technical on-site issues, but because your competitors’ sites are of higher overall quality than yours.

Is this why we're not seeing many Panda recoveries?

In mid- to late 2014, Google was still refreshing Panda monthly. Then, after October of 2014, we had nine months of Panda silence. We all rejoiced when we heard that Google was refreshing Panda again in July of 2015. Google told us it would take a while for this algo to roll out. At the time of writing this, Panda has supposedly been rolling out for three months. I’ve seen some sporadic reports of mild recoveries, but I would say that probably 98% of the sites that have made on-site quality changes in hopes of a Panda recovery have seen no movement at all. While it’s possible that the slow rollout still hasn’t affected the majority of sites, I think that there's another frightening possibility. It's possible that sites that saw a Panda-related ranking demotion will only be able to recover if they can drastically improve the site to the point where users GREATLY prefer this site over their competitors’ sites. It is always good to do an on-site quality audit. I still recommend a thorough site audit for any website that has suffered a loss in traffic that coincides with a Panda refresh date. In many cases, fixing quality issues, such as page speed problems, canonical issues, and confusing URL structures, can result in ranking improvement. But I think that we also need to put a HUGE emphasis on making your site the best of its kind. And that’s not easy. I've reviewed a lot of eCommerce sites that have been hit by Panda over the years. I have seen few of these recover. Many of them have had site audits done by several of the industry’s recognized experts.
In some cases, the sites haven't recovered because they have not implemented the recommended changes. However, there are quite a few sites that have made significant changes, yet still seem to be stuck under some type of ranking demotion. In many cases like this, I've spent some time reviewing competitors’ sites that are currently ranking well. What I’ll do is try to complete a task, such as searching for and reaching the point of purchase of a particular product, on the Panda-hit site as well as on the competitors’ sites. In most cases, I’ll find that the competitors offer a vastly better search experience. They may have a number of things that the Panda-hit site doesn't, such as the following:

- A better search interface.
- Better browsing options (e.g. search by color, size, etc.)
- Pictures that are much better and more descriptive than the standard stock product photos.
- Great, helpful reviews.
- Buying guides that help the searcher determine which product is best to buy.
- Video tutorials on using their products.
- More competitive pricing.
- A shopping cart that's easier to use.

The question that I ask myself is, “If I were buying this product, would I want to search for it and buy it on my client's site, or on one of these competitors’ sites?” The answer is almost always the latter. And this is why Panda recovery is difficult. It’s not easy for a site to simply improve their search interface, add legitimate reviews that are not just scraped from another source, or create guides and video tutorials for many of their products. Even if the site did add these features, this would only bring them to the level where they are perhaps just as good as their competitors. I believe that in order to recover from Panda, you need to show Google that, by far, users prefer your website over any other one. This doesn’t just apply to eCommerce sites. I have reviewed a number of informational sites that have been hit by Panda.
In some cases, clearing up thin content can result in Panda recoveries. But often, when an informational site is hit by Panda, it’s because the overall quality of the content is sub-par. If you run a news site and you’re pushing out fifty stories a day that contain the same information as everyone else in your space, it’s going to be hard to convince Google’s algorithms that they should be showing your site’s pages first. You’ve got to find a way to make your site the one that everyone wants to visit. You want to be the site that, when people see you in the SERPs, even if you’re not sitting at position #1, they say, “Oh…I want to get my news from THAT site…I know them and I trust them…and they always provide good information.” In the past, a mediocre site could be propelled to the top of the SERPs by tweaking things like keywords in title tags, improving internal linking, and building some links. But as Google’s algorithms get better and better at determining quality, the only sites that are going to rank well are the ones that are really good at providing value. Sure, the algorithms are not quite there yet, but they keep improving.

So should I just give up?

No! I still believe that Panda recovery is possible. In fact, I would say that we're in an age of the Internet where we have much potential for improvement. If you've been hit by Panda, then this is your opportunity to dig in deep, work hard, and make your site an incredible site that Google would be proud to recommend. The following posts are good ones to read for people who are trying to improve their sites in the eyes of Panda:

- How the Panda Algorithm Might Evaluate Your Site – A thorough post by Michael Martinez that looks at each of Amit Singhal’s 23 Questions for Panda-hit sites in great detail.
- Leveraging Panda To Get Out Of Product Feed Jail – An excellent post on the Moz blog in which Michael Cottam gives some tips to help make your product pages stand out and be much more valuable than your competitors’ pages.
- Google’s Advice on Making a High-Quality Site – This is short, but contains many nuggets.
- Case Study – One Site’s Recovery from an Ugly SEO Mess – Alan Bleiweiss gives thorough detail on how implementing advice from a strong technical audit resulted in a huge Panda recovery.
- Glenn Gabe’s Panda 4.0 Analysis – This post contains a fantastic list of things to clean up and improve upon for Panda-hit sites.

If you have been hit by Panda, you absolutely must do the following:

- Start with a thorough on-site quality audit.
- Find and remove any large chunks of thin content.
- Deal with anything that annoys users, such as huge popups or navigation that doesn’t work.

But then we have to do more. In the first few years of Panda’s existence, making significant changes in on-site quality could result in beautiful Panda recoveries. I am speculating, though, that now, as Google gets better at determining which sites provide the most value, this may not be enough for many sites. If you have been hit by Panda, it is unlikely that there is a quick fix. It is unlikely that you can tweak a few things or remove a chunk of content and see a dramatic recovery. Most likely, you will need to DRAMATICALLY improve the overall usefulness of the site to the point where it's obvious to everyone that your pages are the best choices for Google to present to searchers.

What do you think?

I am seriously hoping that I'm wrong in predicting that the only sites we'll see make significant Panda recoveries are ones that have dramatically overhauled all of their content. Who knows…perhaps one day soon we'll start seeing awesome recoveries as this agonizingly slow iteration of Panda rolls out. But if we don’t, then we all need to get working on making our sites far better than anyone else’s site! Do you think that technical changes alone can result in Panda recoveries? Or is vastly improving upon all of your content necessary as well?
  21. Posted by RobBeirne We need to talk about bounce rate. Now, before I begin ranting, I'd just like to put on the record that bounce rate can, in certain cases, be a useful metric that can, when viewed in the context of other metrics, give you insights on the performance of the content on your website. I accept that. However, it is also a metric which is often misinterpreted and is, in a lot of cases, misleading. We've gone on the record with our thoughts on bounce rate as a metric, but it's still something that crops up on a regular basis.

The problem with bounce rate

Put simply, bounce rate doesn't do what a lot of people think it does: it does not tell you whether people are reading and engaging with your content in any meaningful way. Let's make sure we're all singing the same song on what exactly bounce rate means. According to Google, "Bounce Rate is the percentage of single-page sessions (i.e. sessions in which the person left your site from the entrance page without interacting with the page)." In simple terms, a bounce is recorded when someone lands on your website and then leaves the site without visiting another page or carrying out a tracked action (event) on the page. The reality is that while bounce rate can give you a useful overview of user behaviour, there are too many unknowns that come with it as a metric to make it a bottom-line KPI for your advertising campaigns, your content marketing campaigns, or any of your marketing campaigns, for that matter. When looked at in isolation, bounce rate gives you very little valuable information. There is a tendency to panic when bounce rate begins to climb or if it is deemed to be "too high." This highly subjective term is often used without consideration of what constitutes an average bounce rate (the average bounce rate for a landing page is generally 70-90%).
There's a school of thought that a high bounce rate can be seen as a good thing, as it means that the user found no need to go looking any further for the information they needed. While there is some merit to this view, and in certain circumstances it can be the case, it seems to me to be overly simplistic and opaque. It's also very important to bear in mind that if a user bounces, they are not included in site metrics such as average session duration. There is, however, a simple way to turn bounce rate into a robust and useful metric. I'm a big fan of adjusted bounce rate, which gives a much better metric on how users are engaging with your website.

The solution: adjusted bounce rate

Essentially, you set up an event which is triggered after a user spends a certain amount of time on the landing page, telling Google Analytics not to count these users as bounces. A user may come to your website, find all of the information they need (a phone number, for example) and then leave the site without visiting another page. Without adjusted bounce rate, such a user would be considered a bounce, even though they had a successful experience. One example we see frequently of when bounce rate can be a very misleading metric is when viewing the performance of your blog posts. A user could land on a blog post and read the whole thing, but if they then leave the site they'll be counted as a bounce. Again, this gives no insight whatsoever into how engaged this user was or whether they had a good experience on your website. By defining a time limit after which you can consider a user to be 'engaged,' that user would no longer count as a bounce, and you'd get a more accurate idea of whether they found what they were looking for. When we implemented adjusted bounce rate on our own website, we were able to see that a lot of our blog posts which had previously had high bounce rates had actually been really engaging to those who read them.
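The distinction can be sketched as a pair of small pure functions over hypothetical session records (the field names and numbers below are made up for illustration; Google Analytics computes this internally):

```javascript
// Made-up session records: a classic bounce is any single-page
// session; an adjusted bounce additionally requires that the visitor
// left before the engagement threshold (e.g. 15000 ms).
const sessions = [
  { pages: 1, timeOnPageMs: 4000 },   // glanced and left: a true bounce
  { pages: 1, timeOnPageMs: 90000 },  // read the whole post, then left
  { pages: 3, timeOnPageMs: 60000 },  // browsed several pages
];

// Classic bounce rate: percentage of single-page sessions.
function bounceRate(sessions) {
  const bounces = sessions.filter(s => s.pages === 1).length;
  return (100 * bounces) / sessions.length;
}

// Adjusted bounce rate: single-page sessions that also ended
// before the engagement threshold.
function adjustedBounceRate(sessions, thresholdMs) {
  const bounces = sessions.filter(
    s => s.pages === 1 && s.timeOnPageMs < thresholdMs
  ).length;
  return (100 * bounces) / sessions.length;
}

console.log(bounceRate(sessions));                // ≈ 66.7
console.log(adjustedBounceRate(sessions, 15000)); // ≈ 33.3
```

The second visitor read the post for a minute and a half; classic bounce rate counts them as a failure, while the adjusted version does not.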
For example, the bounce rate for a study we published on Facebook ad CTRs dropped by 87.32% (from 90.82% to 11.51%), while our Irish E-commerce Study dropped by 76.34% (from 82.59% to 19.54%). When we look at Moz's own Google Analytics for Whiteboard Friday, we can see that they often see bounce rates of over 80%. While I don't know for sure (such is the uncertainty surrounding bounce rate as a metric), I'd be willing to bet that far more than 20% of visitors to the Whiteboard Friday pages are interested and engaged with what Rand has to say. This is an excellent example of where adjusted bounce rate could be implemented to give a more accurate representation of how users are responding to your content. The brilliant thing about digital marketing has always been the ability of marketers to make decisions based on data and to use what we learn to inform our strategy. Adjusted bounce rate gives us much more valuable data than your run-of-the-mill, classic bounce rate. It gives us a much truer picture of on-site user behaviour. Adjusted bounce rate is simple to implement, even if you're not familiar with code, requiring just a small one-line alteration to the Google Analytics code on your website. The below snippet of code is just the standard Google Analytics tag (be sure to add your own tracking ID in place of the "UA-XXXXXXX-1"), with one extra line added (the line beginning with "setTimeout", and marked with an "additional line" comment in the code). This extra line is all that needs to be added to your current tag to set up adjusted bounce rate.

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
  _gaq.push(['_trackPageview']);
  setTimeout("_gaq.push(['_trackEvent', '15_seconds', 'read'])", 15000); // --additional line
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>

It's a really simple job for your developer; simply replace the old snippet with the one above (that way you won't need to worry about your tracking going offline due to a code mishap). In the code above, the time is set to 15 seconds, but this can be changed (both the '15_seconds' and the 15000) depending on when you consider the user to be "engaged". The '15_seconds' names your event, while the final part inside the parentheses sets the time interval and must be given in milliseconds (e.g. 30 seconds would be 30000, 60 seconds would be 60000, etc.). On our own website, we have it set to 30 seconds, which we feel is enough time for a user to decide whether or not they're in the right place and whether they want to leave the site (bounce). Switching over to adjusted bounce rate will mean you'll see fewer bounces within Google Analytics, as well as improving the accuracy of other metrics, such as average session duration, but it won't affect the tracking in any other way. Adjusted bounce rate isn't perfect, but its improved data and ease of implementation are a massive step in the right direction. It helps answer the question we've always wanted bounce rate to answer: "Are people actually reading my content?" I firmly believe that every website should be using adjusted bounce rate. Let me know what you think in the comments below.
  22. I have already written on this blog about asynchronous controllers, both in plain ASP.NET and in ASP.NET MVC. Sometimes, however, we want to implement something along the lines of a "fire & forget" model. Of course, queueing systems such as NServiceBus are much better suited for that, but for very simple cases it is enough to spin up a thread and run some time-consuming operation on it. By time-consuming I mean one that performs […]
  23. In today's post I would like to sum up the previous considerations with some tips on how to defend against SQL injection. 1. Avoiding dynamic queries. The basic thing is, of course, writing queries in such a way that it is impossible to manipulate the parameters. The following code is an extremely bad situation: A safe approach is to use parameterized queries, i.e.: Every database driver provides an analogous API for […]
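The post's code samples (C#/ADO.NET) did not survive extraction; as a language-neutral illustration of the same point, here is a sketch in JavaScript using node-postgres-style `$1` placeholders and a hypothetical `users` table (the names are assumptions, not the original code):

```javascript
// Hostile input that a user might submit in a form field.
const userInput = "'; DROP TABLE users; --";

// BAD: a dynamic query built by string concatenation. The input
// becomes part of the SQL text, so it can terminate the string
// literal and inject its own statements.
const unsafeSql = "SELECT * FROM users WHERE name = '" + userInput + "'";

// GOOD: a parameterized query. The SQL text is a fixed template; the
// driver sends the values separately, so they are treated strictly
// as data and can never be parsed as SQL.
const safeQuery = {
  text: 'SELECT * FROM users WHERE name = $1',
  values: [userInput],
};
```

Every mainstream driver exposes an equivalent mechanism: SqlParameter in ADO.NET (which the original post targets), prepared statements in JDBC, placeholders in PDO, and so on.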
  24. As I wrote yesterday on Facebook, the author meet-up in Kraków is now confirmed and being organized. It will take place on November 9 during Sekurak Hacking Party, a series of regular meetups organized by one of the book's partners, the Sekurak site. Registration is already open; admission is free and registration itself is optional (so if someone forgets, it's no problem), but I do encourage you to register, as it will help us estimate how big a room we need. Two more author meet-ups in other cities will take place in November: (tentatively) November 6 in Warsaw, and in the second half of November in Wrocław; I will post more details about them as soon as they are reasonably certain.

Sekurak Hacking Party + author meet-up:
Registration: click
When: November 9, 17:45-20:45
Where: Kraków, ... (we will announce the exact venue soon)
Agenda (tentative):
■ Intro - Michał Sajdak
■ Talk: "Selected bugs reported through Google Bug Bounty" - Michał Bentkowski
■ Talk - Mateusz Jurczyk
■ Author meet-up with Gynvael Coldwind, author of the book "Zrozumieć Programowanie"
● Talk: "About the book, about programming, about security"
● Q&A session

I would also like to take this opportunity to thank everyone who trusted me and bought "Zrozumieć Programowanie" in the pre-sale; the interest exceeded my wildest estimates - thanks! :) By the way, the pre-sale is still open - I wrote more about it in the previous post. PS. As usual I'm behind with the new podcast series, but no worries, nothing has changed; it is still planned and will appear soon ;)
  25. ReSharper gives really valuable hints. Not all of them are obvious, and sometimes you need to dig deeper into the topic. One such hint is using IndexOf together with StringComparison.Ordinal. Let's assume we have the following code: ReSharper will suggest converting it to: Why? If we don't pass the regional settings explicitly, the current culture will be used by default. Sometimes, of course, that is exactly what we want, and that is why by default the […]
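The original C# samples are missing from this excerpt, but the same ordinal-versus-culture pitfall exists in JavaScript, so here is an analogous sketch (not the original code): comparison operators compare raw UTF-16 code units (ordinal), while `localeCompare` applies culture-specific collation rules.

```javascript
// Ordinal comparison: 'ä' (U+00E4) sorts after 'z' (U+007A)
// because raw code units are compared.
const ordinalAfterZ = 'ä' > 'z';                  // true

// Culture-aware comparison: under German collation rules,
// 'ä' sorts near 'a', i.e. before 'z', so the result is negative.
const germanOrder = 'ä'.localeCompare('z', 'de');
```

This is exactly the trap ReSharper warns about: code that relies on the current culture implicitly can behave differently on machines with different regional settings, which is why the ordinal variant is the safe default for non-linguistic comparisons.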