Archive for the ‘frequency’ tag
“The stock market today is a war zone, where algobots fight each other over pennies, millions of times a second…inevitably, at some point in the future, significant losses will end up being borne by investors with no direct connection to the HFT world, which is so complex that its potential systemic repercussions are literally unknowable.” Felix Salmon
I’ve written about algorithms before. I think it’s inevitable that the trading of media space will become ever more automated. Price customisation software will play an ever bigger role in the optimisation of pricing. And I think algorithms are fundamental to the future of content. But what about when they go wrong?
A fortnight ago, a computerised trading programme belonging to leading US stockbroker Knight Capital Group ‘ran amok’, with staffers at Knight unable to stop it trading for more than half an hour. The result was a near-fatal $440 million loss; the company was kept alive only by emergency financing and is now likely to have to sell off parts of its business to keep going.
On May 6th 2010, in the so-called Flash Crash, algorithmic trading contributed to the second-largest point swing and the biggest one-day point decline in the history of the Dow Jones Industrial Average, as it lost about 9% of its value, only to recover those losses within a matter of minutes. A joint report by the U.S. Securities and Exchange Commission and the Commodity Futures Trading Commission identified how an unusually large sell-off of E-Mini S&P 500 contracts by a large mutual fund firm had initially exhausted available buyers, and how high-frequency algorithmic traders had then begun aggressively selling, accelerating the effect of the mutual fund’s selling. The report portrayed “a market so fragmented and fragile that a single large trade could send stocks into a sudden spiral.”
Such algorithmic trading is more common than you might think. As of 2009, High Frequency Trading (HFT) firms accounted for 73% of all US equity trading volume. HFT uses algorithms to make highly complex decisions at lightning speed, before human traders are capable of processing the same information. Automated trades are used on the buy side (by pension funds and mutual funds, for example) to sub-divide large trades to minimise market impact and risk (and, in some sense, hide what they’re doing), and on the sell side (by so-called market makers and hedge funds) to provide liquidity to the market. Many, however, have questioned the value of that liquidity, saying that it “has a rather ghostly quality and tends to vanish when needed most”.
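As a rough illustration of the buy-side order slicing described above, a simple time-weighted (TWAP) schedule divides a large parent order into near-equal child orders spread across the trading window. This is a minimal sketch, not how any particular firm’s execution algorithm actually works:

```python
# Minimal sketch of buy-side order slicing on a TWAP (time-weighted
# average price) schedule: a large parent order is divided into
# near-equal child orders spread over the trading window, so no single
# trade is large enough to move the market on its own.

def twap_slices(total_shares, window_minutes, interval_minutes):
    """Split a parent order into one child order per interval."""
    n_slices = window_minutes // interval_minutes
    base = total_shares // n_slices
    remainder = total_shares % n_slices
    # Spread any remainder across the first few slices so sizes stay even.
    return [base + (1 if i < remainder else 0) for i in range(n_slices)]

# A 100,000-share order worked in 30-minute slices over a 6.5-hour session:
slices = twap_slices(total_shares=100_000, window_minutes=390, interval_minutes=30)
print(len(slices), sum(slices))  # 13 child orders that together equal the parent
```

Real execution algorithms go further, adapting slice sizes to live volume and price so the schedule itself is harder to detect.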
The animated GIF above shows the amount of high-frequency trading in the stock market from January 2007 to January 2012. It shows not only the rise in HFT over that time but a world that, as Felix Salmon of Reuters noted, “in aggregate seemingly has a mind of its own when it comes to trading patterns”. The stock market, says Salmon, is clearly more dangerous than it was in 2007, incorporating a much greater tail risk; yet in return for facing that danger, “society as a whole has received precious little utility”.
Automation and algorithms are changing the structure of our markets. That much is perhaps inevitable. But is it right that in the quest for speed and frequency we are building the kind of systemic risk whose scale may be unknown but which could well impact far outside the domain of the financial markets? Personally, I think not.
HT to @BBHLabs for the Felix Salmon link
Editor’s Notes: John C. Zolper, Ph.D. is the Vice President of Research and Development at Raytheon, an American corporation with core manufacturing concentrations in weapons and military and commercial electronics. So yeah, neat stuff.
I’ve got a riddle for you. What do Blu-ray discs, military radars and LED light bulbs have in common? Chances are, if you work outside of the defense or electronics sectors, you may not easily make the connection. But the common thread is a little-known technology called gallium nitride (GaN for short). GaN is evolving rapidly behind the scenes to transform many aspects of modern-day life, while also serving vitally important roles within our nation’s military.
GaN is a wide-bandgap semiconductor material with special properties that are ideal for applications in optoelectronics and high-power, high-frequency amplifiers. Aerospace and defense innovators have long recognized the critical competitive advantages GaN represents for high-frequency electronics – including significant cost, size, weight and power reductions – and have spent years refining and continuously pushing GaN technology to new limits. For example, GaN is playing an integral role in developing more reliable military radars that can be five times more powerful than traditional systems, or only half the size. In recent years, technologists across a number of commercial industries have taken notice of these pioneering innovations for the military, and have started putting GaN to work to power everyday technologies in ways that significantly reduce energy costs and environmental impact.
Take the Blu-ray disc, for example. The next-generation DVD is changing the way the world watches movies. Blu-ray discs store video and audio data in “pits,” or tiny grooves, which are about half the size of those in traditional DVDs. The tiny, highly accurate Blu-ray laser beam – powered by GaN-based violet laser diodes – can precisely read these hyperfine pits. This enables closer spacing of data and up to five times the storage capacity of a traditional DVD (roughly 27 GB of data). GaN technology enables the higher resolution and crystal-clear imagery modern movie buffs have come to expect. With support from two of the world’s largest PC manufacturers, HP and Dell, Blu-ray is poised as the next-generation optical disc format – with the potential to dramatically increase PC data storage in the coming years.
You’ve likely seen the light bulb revolution that’s taking place, but may not have known gallium nitride is at the center of it. As traditional, century-old incandescent bulbs are slowly phased out by federal mandate, LED light bulbs represent the future of the lighting industry. A GaN-powered LED light bulb can easily outlast traditional bulbs by several years, while consuming a tenth of the power and reducing CO2 emissions by 90 percent. The Department of Energy recently commended Philips Lighting for creating an LED bulb that would last more than 20 years – an innovative design with the potential to save Americans a combined $3.9 billion in annual energy costs and reduce U.S. carbon emissions by 20 million metric tons. A number of young companies, including startup Soraa, remain focused on driving innovations in cost-effective LED lighting for the masses.
And LCD televisions, backlit by GaN-powered LED lighting, are thinner, lighter and up to 40 percent more energy-efficient than those using CCFL backlighting. In an effort to reduce the price point for consumers, pioneering companies such as Sony are now introducing the next wave of LED televisions, which will use edge-lit LED as the TV’s light source, reducing the number of LED lights required as compared to first generation LED televisions.
For mobile users, GaN can help ensure an affirmative answer to the old question, “Can you hear me now?” The efficiency and resistance to heat and electronic interference of microwave amplifiers built with GaN enables broader, more reliable cellular coverage, while eliminating the need for power-sucking cooling fans required by older cell phone tower technologies. RFHIC Corp of Suwon, South Korea, which makes GaN-based radio frequency and microwave components for telecommunications and broadcasting industries, estimates U.S. carriers could save approximately $2 billion per year by using GaN technology for their wireless infrastructures. Large carriers, including Sprint, have already launched GaN-powered towers in several markets.
While GaN-powered technologies quickly evolve to alter many aspects of modern-day life, GaN electronics are expected to play an increasingly important role within our nation’s military systems. Raytheon has been awarded a contract by the Defense Advanced Research Projects Agency (DARPA) to develop next-generation GaN electronic devices bonded to diamond substrates, which is expected to triple current GaN circuit capabilities. The application of a markedly more efficient GaN-on-diamond material is expected to significantly benefit next-generation radar, communications and electronic warfare systems that employ GaN-based radio frequency devices.
When you think of how much technology is empowered by a tiny microchip, it’s not hard to imagine how GaN will rapidly accelerate innovation across numerous industries in the years ahead. Undoubtedly, future innovators will find new ways to apply GaN technology to our iPads and smartphones, bringing the networked world to consumers’ fingertips more quickly and effortlessly. Companies from start-ups to larger enterprises looking to revolutionize their industries would do well to consider how GaN can drive innovation within their business models. In the meantime, rest assured, innovators, investors and military engineers are already hard at work, staging the next technical revolution.
Until now, every Facebook mobile ad had to be triggered by your or a friend’s activity. Today, however, Facebook begins testing a new non-social ad unit that lets developers buy mobile news feed ads that open Android and iOS app store purchase pages when clicked. They’re designed to help developers grow their business. The ads are priced per click, not per install; however, Facebook tells me it hopes to let devs measure installs driven by their app ads in the future, and it is now taking developer signups for the currently small private beta.
By opening up the mobile news feed to traditional, non-social ads, Facebook will have to be very careful about how often these promotions appear to make sure they don’t drown out organic content and cause us to stick our phones right back in our pockets.
Facebook mobile app ads appear in a “Try These Games” panel in between traditional stories on the mobile news feed. The panel shows the name, thumbnail image, and number of friends playing (if any) for each of a few apps (three in the example we’ve received). Organic and paid entries in the panel can appear side-by-side, with ads marked “sponsored”. Clicking through opens an app’s native iOS App Store application or Google Play application on your phone or tablet.
Facebook mobile ad reach, clicks, frequency, and spend can be tracked through a dashboard, and the ads can take advantage of all of Facebook’s biographical, interest, and device targeting options. This makes them much more flexible than Sponsored Stories, which advertisers could only target to friends of people who had already mentioned their brand or used their app. That means developers won’t need an existing user base to advertise their apps, and they can be employed to promote game launches — currently a huge source of developer ad spend on Facebook’s website.
For example, ads for a new iOS-only girls’ fashion game could be targeted to iOS device-carrying females 16 to 45 years old living in Los Angeles, to maximize relevance.
Wall Street should be pleased to see Facebook getting more aggressive about mobile monetization. But the fact that these are just straight-up ads, not stories about friends that businesses pay to appear more frequently, brings Facebook into murky waters on mobile. It’s previously relied on the idea that its social ads are content to justify their injection into the news feed.
That’s why it’s a little frightening that Facebook told me, “It’s hard to say if there’s going to be a frequency limit” to how often mobile app ads appear. It did say that determining whether a limit is needed is part of what it’s watching for in the beta, and that “we don’t want to show too much sponsored content because that would be the wrong experience for news feed.” Facebook will be testing, and we’ll be watching to make sure users don’t rebel because the company is diluting a feed originally meant for friends’ photos and status updates with paid ads for random games.
It’s been a very long time coming, but the UK regulator Ofcom has finally revealed plans for the auction of 4G spectrum, which means that by late next year the UK may, finally, start to see a commercial rollout of 4G services like LTE. Bidding in the auction, for spectrum in the 800 MHz and 2.6 GHz bands, is likely to start in early 2013. It will be the largest-ever spectrum auction in the country, some 80 percent bigger than the 3G auction that saw billions of pounds invested by operators in 2000. And ultimately, the sale will mean at least 98 percent of the UK will have access to mobile broadband.
The regulator has been performing a fine balancing act over the last couple of years on this issue — with the larger operators O2, Vodafone and T-Mobile/Orange (the Everything Everywhere JV) wanting to ensure they get sizeable shares of spectrum to serve their existing customer bases, doing battle with smaller operators worried about getting shut out of the process. In the past, that has spelled frustrating delays for all concerned. Today Ofcom hit back, saying “reports of delays are way off the mark.” Its solution to the bunfight? It will give the three biggies a crack at the spectrum, but will also reserve a tranche for a fourth party, a wholesaler like Hutchison 3G or someone else.
Ofcom makes a very big point of not mentioning what technologies will be used on the 4G spectrum. Mobile operators are almost certainly going to go with LTE services, but there is also the possibility of WiMax in its many forms, mesh networks and more.
“The 4G auction has been designed to deliver the maximum possible benefit to consumers and citizens across the UK,” Ed Richards, Ofcom Chief Executive, said in a statement. “As a direct result of the measures Ofcom is introducing, consumers will be able to surf the web, stream videos and download email attachments on their mobile device from almost every home in the UK.”
Ofcom notes that the 4G auction will offer at least two spectrum bands – 800 MHz and 2.6 GHz – with the lower-frequency 800 MHz band forming part of the ‘digital dividend’, which is ideal for widespread mobile coverage (this is the spectrum appropriated from broadcasters). The higher-frequency 2.6 GHz band will have a shorter range but will be able to carry faster speeds. Together the bands will add up to 250 MHz of additional mobile spectrum, Ofcom says. Today there is 333 MHz in use.
Ofcom has also started to lay out some of the requirements for coverage:
– The spectrum will be released in “lots” and one of the 800 MHz lots will carry an obligation to provide a mobile broadband service for indoor reception to at least 98% of the UK population by the end of 2017 at the latest.
– There will also be a requirement to ensure 95 percent indoor coverage in each UK nation — England, Northern Ireland, Scotland and Wales.
– Another consultation, on the “legal instrument” implementing the auction, is set for September 11.
– The full whack of Ofcom documents on the 4G spectrum sale — including details on specific spectrum tranches and regional rollouts — can be accessed here.
[Image: Mark Fischer, Flickr]
The email marketing landscape has changed. Most marketers are already on board with the importance of email marketing, and are working hard to refine the nuances of their strategy — including figuring out what the heck the right sending frequency is for their email campaigns. Does sending way more emails make you a spammer? Does sending less email result in potential revenue loss?
Well, it’s time to analyze both sides of the email sending frequency issue further by hosting another marketing debate. And here at HubSpot, we have two employees with very opposing views on this subject.
Dan Zarrella, our social media scientist and author of Hierarchy of Contagiousness, claims that sending more emails is better. Meanwhile, Sam Mallikarjunan, inbound marketing manager and co-author of an upcoming ecommerce marketing book, believes excessive emailing leads to list attrition. The two of them will battle it out, defending their opinions live on air on Friday, July 13th at 1 PM EDT. I’ll be moderating this marketing debate, pulling in your questions through Twitter via #MKTGdebate.
To get a little perspective before the debate takes place, let’s evaluate some important email marketing statistics that will help fuel this debate. Does any of this change your opinion on ideal email sending frequency? Take a gander!
1) Emails targeted to customer loyalty programs have a 40% higher open rate.
Every email marketer worth his or her salt knows that segmentation is critical to email marketing effectiveness. But this stat could support both Sam’s and Dan’s stances on the issue. How? Well, if targeted emails have a higher open rate, that should enable you to send more emails to that group, since they are more receptive to such well-targeted content.
However, this statistic could also indicate that a high email sending frequency to an already engaged list doesn’t necessarily mean you’re deriving any added value from your efforts; after all, they’re opening, not necessarily clicking. Plus, you should also be trying to engage the segments of your list that aren’t already your cheerleaders — those involved in a customer loyalty program aren’t at a high risk of abandoning your company for a competitor.
(Source: Experian, April 2012)
2) More than 80% of email marketers send the same content to all subscribers.
It’s interesting that this statistic sprouts from the same report as the stat above. Despite the fact that targeted sends have a 40% higher open rate, 80% of marketers are still emailing the same content to all subscribers! Dan could argue that this means marketers don’t even have to invest time in creating new email content since it isn’t hampering open rates; but Sam might argue (again) that open rates mean nothing without clicks, and email sends to those who are already customers do nothing to help grow your new customer base.
(Source: Experian, April 2012)
3) 69% of U.S. email users unsubscribe from a business or non-profit email because the organization sends too many emails.
This statistic clearly plays in favor of Sam’s stance: sending more email leads to attrition. However, the following graph (compiled from MailChimp data with 9.5 billion data points) shows the effect of email sending frequency on unsubscribe rate. According to this data, sending more emails does not correlate with higher unsubscribe rates, suggesting higher email frequencies won’t hurt your business. What it doesn’t confirm, however, is how many of the users receiving your emails have set up filters that send them to SPAM. Plus, we all know that not unsubscribing does not an engaged email recipient make — many users simply delete your message without bothering to unsubscribe.
(Source: CMB, March 2012)
4) 50% of consumers worldwide trust email messages from companies they have signed up to receive.
If consumers trust the influx of messages sent to their inboxes — AKA you’re only emailing people who have opted in to receive email communications from you — why should it matter how many messages they receive? But just because they trust the message doesn’t necessarily mean they’re not annoyed by the number of messages. If you’ve built a certain level of trust with your readers, why would you jeopardize that by bombarding them with everything you have to say?
(Source: Nielsen 2012)
5) 76.5% of commercial emails sent reached recipients’ inboxes in 2011, while the volume of email blocked or flagged as SPAM increased 24%.
The vast majority of emails are reaching their target destinations, so it’s safe to say that if you’re sending a lot of email, it’s landing in inboxes. But at the same time, the number of users marking content as SPAM is also increasing. While that number may not be as extreme as the deliverability rate, it’s important to realize that recipients are marking more and more content as SPAM — whether that’s because they’re simply getting more comfortable doing so or because their tolerance for a cluttered inbox is decreasing is unclear. This fact, however, should prompt you to truly evaluate how your email deliverability rates correlate with your desired email sending frequency.
(Source: ReturnPath, 2011)
Whose side will you be on this Friday, and what are your opinions on email sending frequency?
The terms of the deal were not disclosed. However, unlike many acquisition agreements, Vizu’s tech will be integrated immediately into Nielsen’s offerings. Vizu’s Brand Effect suite will make it possible for Nielsen to give out real-time reports of online ad performance that is “broken out by media plan participant, frequency of ad exposure, advertising execution, and targeting strategy.” All of those are key to marketing online in the age of social media and real-time engagement.
“We are committed to providing a complete understanding of a brand’s end-to-end advertising campaign impact,” said Steve Hasker, president of global media products and ad solutions for Nielsen, in a statement. “Vizu has developed a best-in-class solution for measuring and optimizing brand advertising effectiveness online, which offers a powerful complement to Nielsen’s cross-platform solutions for the advertising industry. We are pleased to welcome Vizu’s talented team to Nielsen and are excited to work together to further evolve our Brand Effect product suite, and provide unparalleled capabilities to advertisers, publishers and agencies.”
Prior to Nielsen’s buy, San Francisco-based Vizu had raised $10.7 million from investors including Draper Fisher Jurvetson, iNovia Capital, Greycroft Partners, Ron Conway, and Esther Dyson.
Photo credit: Angela Waye/Shutterstock
On Wednesday we hosted our latest webinar, The Science of Email Marketing, where our very own Dan Zarrella presented some juicy new email marketing data and insights. So juicy, in fact, that some equally juicy follow-up questions came rolling in from those listening to the webinar.
So we read through them all (yes, we actually read your questions!) and pulled out the 11 questions that were asked most frequently and that we thought everyone would benefit from hearing a little bit more about. So here they are, your top email marketing questions from Wednesday’s webinar, answered!
1) Should an email come from a person or a business for a better open rate?
This is an excellent question because it’s the subject of ongoing debate amongst email marketers — and that’s because there’s no one right answer that works for everyone. We performed an A/B test of our own, in fact, to see whether emails sent from the lovely Maggie (one of the people responsible for Wednesday’s webinar!) performed better than emails sent from HubSpot:
As you can see, the treatment’s 0.96% click-through rate beat out the control’s 0.73% click-through rate — which also yielded us 131 more leads than our control. So it seems that for us, emails sent from a real person’s name are more likely to get clicked than emails sent by a company’s name.
Thing is, there’s a case to be made for the fact that your email recipients might know your company better than they know an individual within your company. We get that. That’s why it’s critical to perform an A/B test like this for yourself to determine which method is best for you.
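When you run an A/B test like this yourself, it’s worth checking that the difference isn’t just noise before declaring a winner. A standard two-proportion z-test is one way to do that; the click and send counts below are hypothetical figures chosen to match the rates above, not our actual numbers:

```python
# Two-proportion z-test: is the gap between two click-through rates
# statistically meaningful? Hypothetical send counts are used below.
import math

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)  # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical: 100,000 sends per variant at 0.96% and 0.73% CTR.
z = two_proportion_z(clicks_a=960, sends_a=100_000, clicks_b=730, sends_b=100_000)
print(round(z, 2))  # |z| above 1.96 is significant at the 95% confidence level
```

With small lists the same percentage gap can easily fall below that threshold, which is why sample size matters as much as the rates themselves.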
2) How many characters do you suggest for the subject line of an email?
While some email clients display more subject line characters than others, aim to keep it under 50 characters, especially because many recipients will be reading on mobile devices that display even less of the subject line — often 20 characters or fewer. To deal with this discrepancy, make sure the beginning of your email subject line gives the recipient enough information to understand the contents of your email, just in case your subject line is cut off prematurely.
3) Does using numbers or special characters in an email’s subject line impact its open rate?
Yes, though not enormously so. As detailed on the webinar, ampersands, brackets, and parentheses showed slightly higher click-through rates when included in the subject line, while things like question marks and hashtags (pound signs, if you’re still living in the 20th century) did appear to have some negative impact. When it comes to symbols like these, it’s not something that’s going to make or break your open rate, but if you can avoid the overzealous exclamation point, do it — especially because exclamation points often trigger emails to go into SPAM folders. Your subscribers are likely already desensitized to the typical displays of feigned emotion, so exclamation points and excessive use of capitalization will probably have no positive impact on your email.
4) What is considered a decent click-through rate for an email?
This is going to depend on what type of email you’re sending. As we’ve previously reported, transactional emails such as order receipts or confirmations have the highest click-through rate, followed by newsletters, with promotional emails having the lowest CTR of all. It makes sense — think of how much more engaged a brand new customer is with your brand (someone who might receive an order confirmation email), compared to someone who is just periodically staying up to date on your brand (someone who might receive a newsletter), compared to a lead (someone who might receive a promotional email). So a decent CTR for you is going to vary depending on what type of email you’re sending, and to which list.
That being said, eMarketer published the average email click-through rate of emails in North America, finding it was 5.5% in Q3 2011, up slightly from the previous quarter. The thing is, this is across all industries and email types — so the data isn’t necessarily a proper benchmark for every marketer to measure their emails by. The key is to continue testing variables in your emails that will help improve your click-through rate, and I’d start with list segmentation. Segmenting your email lists is some seriously low-hanging fruit, and has been shown to improve email relevance by 34%. And you know what more relevant content means — more clicks!
5) Is the click-through rate of business emails higher on mobile devices than on computers?
According to All Things Digital, in 2011 the average mobile email click-through rate was 4.12% for smartphones, 3.12% for tablets, and only 2.39% for computers. Even with those numbers readily available, believe it or not, not every company has optimized their email marketing for mobile. And mobile optimization is a critical component of your email marketing strategy — according to Comscore, 70 million Americans utilize mobile email, and 43% of those users are checking their email on their mobiles four or more times per day. If you were listening to the webinar, you remember Dan pointing out that 80% of users in his data indicated they utilize mobile email. If you want to increase your mobile click-through rate, make sure every email you’re sending out is optimized for mobile, because your audience IS there.
6) What is considered a decent unsubscribe rate?
The short answer? Under 1%. Aside from being generally trustworthy, you’ll be able to achieve this rate by only using opt-in email lists, properly segmenting your lists, only sending relevant content at an appropriate sending frequency, and religiously honoring unsubscribes. Not sure if your practices are considered trustworthy? There’s a free service by Return Path called Sender Score that will tell you.
And remember, unsubscribes aren’t all bad! When recipients do unsubscribe, consider it a natural list cleanse — after all, for the health of your Sender Score, you don’t want to be sending emails to those who don’t want to receive them. If your unsubscribe rate stays under 1%, don’t let those disinterested few get you down!
7) When should I be sending my emails?
There are really three frequency measurements you should consider, backed up by data from HubSpot’s Dan Zarrella:
- Time of Day – Dan Zarrella’s data showed emails sent at 6 AM had the highest click-through rate. Emails sent from 10 AM-12 PM showed another small spike, and the later the time in the evening, the higher the click-through rate climbed.
- Day of the Week – Experimenting with weekend emails could benefit your business, perhaps due to the lack of other competing emails coming through on Saturday and Sunday. Zarrella’s data showed the lowest click-through rate (and highest unsubscribe rate!) occurred on Tuesday, with Wednesday and Friday coming in as the weekdays with the highest click-through rates.
- Number of Days Per Week – Zarrella’s data showed that click-through rates decrease the more emails you send in a given campaign, so our best practice here is to “chill” a bit. Space your emails out so your subscribers don’t feel bombarded. We’ve also written a blog post that outlines the steps you can take to perform your own email sending frequency test — check it out!
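One way to structure a sending-frequency test along these lines is to split your list into stable cohorts, send to each at a different cadence, and compare click-through and unsubscribe rates per cohort afterwards. This is a minimal sketch; the hash-based split and the cohort frequencies are illustrative assumptions, not a feature of any particular tool:

```python
# Sketch of a sending-frequency test setup: deterministically assign
# each subscriber to a cadence cohort, so the same address always gets
# the same treatment for the duration of the test.
import hashlib

FREQUENCIES = [1, 2, 4]  # emails per week, one cadence per cohort (illustrative)

def assign_cohort(email_address):
    """Stable assignment: the same address always lands in the same cohort."""
    digest = hashlib.sha256(email_address.strip().lower().encode()).hexdigest()
    return FREQUENCIES[int(digest, 16) % len(FREQUENCIES)]

for addr in ["a@example.com", "b@example.com", "c@example.com"]:
    print(addr, "->", assign_cohort(addr), "emails/week")
```

After the test window, compare unsubscribe and click-through rates across the cohorts before settling on a list-wide cadence.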
8) If there are multiple links in your email copy, how do you balance the attention you give to each of them?
It’s never a bad idea to include multiple links in an email, since each link is a call-to-action that could reconvert your email recipient. That said, you don’t want those calls-to-action to compete with one another, which is why it’s crucial that you decide exactly what it is you want your email recipient to do upon receiving your email. That way, none of the links are competing with one another for attention — they’re all contributing to the same goal! For example, when we send an email, we have multiple links contained therein, but they all help us reach our end goal of reconverting recipients and moving them further down the sales funnel:
The first two links called out in orange both lead to the same landing page, but they are accompanied further down the email by a second call-to-action that encourages the recipient to reconvert on a more bottom-of-the-funnel offer — a free trial of our marketing software. Both of these links work together to help us meet our reconversion goals, so however our recipients choose to interact with this content, we’re happy campers!
9) What is the difference between a paid and organic email contact? Is one better than the other?
An “organic” email contact is someone who chose to subscribe to emails from your company by clicking on a subscribe button on your website, filling out a form on one of your landing pages, or otherwise indicating their willingness to receive your emails. A paid email contact is someone who didn’t indicate they wanted to receive email from you, but whose contact information you purchased.
As for whether one is better than the other … yes, organic email contacts are way better than those you pay for. We’ve written an entire blog post on this subject, but in a nutshell, those who opt in to receive your email communications are more interested in your company than those you pay for, are less likely to mark you as SPAM, will unsubscribe at a far lower rate, and as such, your email deliverability rates and sender reputation won’t get totally annihilated.
10) Any tips on how to comply with the CAN-SPAM Act?
Definitely. There are four main points that are important for email marketers to remain compliant with CAN-SPAM:
- Don’t use misleading, deceptive, or falsified information in your ‘From,’ ‘To,’ or ‘Reply-to’ fields, email subject line, or routing information. Clearly identify who the email is coming from — be it your company or a specific employee within your organization — and make sure your subject line accurately describes the contents of the email.
- Include your company’s physical address in every single email you send out. This is typically placed in an email’s footer.
- Include an easy-to-find unsubscribe link in every single email, and make sure to honor unsubscribes promptly and completely. “Promptly” is defined as within ten business days (but try to be speedier than that), and “completely” means that you do not sell or transfer their email address to any other lists after the unsubscribe is complete.
- Make sure the Email Service Provider (ESP) you’re using is reputable. If something illegal does go down with your emails, both your ESP and your company can be held responsible.
Check out our blog post on the laws marketers need to know to avoid legal backlash if you want to learn more about CAN-SPAM.
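The first three points above can even be sanity-checked automatically before an email goes out. Here’s a minimal sketch of such a pre-send check, assuming a simple dict-based email payload — the field names and the address-detection regex are illustrative, not part of any real ESP API:

```python
import re

def can_spam_check(email: dict) -> list:
    """Return a list of CAN-SPAM problems found in an outgoing email."""
    problems = []
    if not email.get("from_address"):
        problems.append("Missing or empty 'From' field")
    if not email.get("subject"):
        problems.append("Missing subject line")
    body = email.get("body", "")
    if "unsubscribe" not in body.lower():
        problems.append("No visible unsubscribe link in body")
    # Very rough heuristic for a street address in the footer
    if not re.search(r"\d+\s+\w+.*(St|Ave|Road|Rd|Street|Suite|Blvd)", body):
        problems.append("No physical mailing address detected")
    return problems

email = {
    "from_address": "news@example.com",
    "subject": "Our March newsletter",
    "body": "Hello! ... Unsubscribe here. 25 First St, Cambridge, MA",
}
print(can_spam_check(email))  # → [] when all checks pass
```

A check like this can’t judge whether a subject line is deceptive — that still takes a human — but it catches the mechanical omissions before they become legal problems.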
11) What are marketing automation tools? Is there a particular marketing automation tool that HubSpot recommends?
In the context of email marketing, marketing automation tools help marketing departments like yours carry out automated email campaigns efficiently. They can integrate with your CRM to track the actions and behaviors of your leads, and can launch email campaigns based on these behaviors or other triggers you set, such as download history or other forms of lead intelligence. Marketing automation tools are critical to carrying out your lead-nurturing campaigns, and they can also help you monitor the performance of your email campaigns so you can continue to make adjustments that improve your performance. As to whether we recommend any marketing automation tools, well, we’re obviously partial to our own marketing automation software.
What email marketing questions are still on your mind?
Image credit: bilal-kamoon
I’ll be the first to admit that when I receive revenue stats like “95 cents per email sent,” an enterprising part of me thinks, “If I send more email, I’ll make more money!”
The good news: This could be true. The bad news: Sending more email could be destructive to the precious reputation you’ve built — and to your revenue (even if it takes a while to take effect).
So do we give up that extra revenue for fear of losing it all? Do we simply ignore those of our customers who are willing to convert more frequently to appease those who don’t want more email? Do we gamble with our revenue?
With today’s technology, absolutely not.
Look at cadence as an opportunity to be even more respectful (i.e., responsive) to each customer’s desires. Begin to learn about how many emails your customers want and when they want emails (and what they want in those emails).
This is harder than it sounds. Your list is not populated by uniform, static automatons who respond with machine efficiency to your marketing efforts. Each subscriber has different needs during different seasons and during different stages in their life. And these needs differ from those of the next subscriber on the list.
The sophisticated email marketer should be able to vary cadence — frequency and timing — based on users’ preferences. The sophisticated email marketer can also find at least three ways (maybe more) to discern these preferences and assign different subscribers to different lists.
Method 1: Ask them directly
As part of an enhanced welcome program (or whenever you choose), poll your subscribers on how they’d like to receive emails. Start with just one parameter, such as the stage of their life. For example, you could send a poll asking:
Which best describes your current living situation?
a) Renting an apartment
b) Looking to buy a house
c) Own a house
d) Transitioning, looking for a new place to live
e) Wandering or homeless at the moment
If you were a home retailer, you could use this information to alter frequency and timing appropriately (not to mention content). Test it with 20 percent of your list, and see if you get a lift in your KPIs over standard promotional campaigns.
Method 2: Observe and assume
Separate your list into a few groups and experiment. For example, if you currently mail all of your list three to four times a month, make three groups. For group A, do what you’ve always done. For group B, send just two emails a month. For group C, send six emails a month.
Alter your strategy based on your intuitive knowledge of your lists, but at least try more frequently and less frequently (if you’re testing for frequency only). You can do more groups if you like, and you don’t have to put an equal amount into each group (e.g., five test groups could have only 10 percent of your list if you’re worried).
See what happens. Really dig into the data and see if certain subscribers opened more or converted more when they got less email. Assign them to receive fewer emails moving forward. If some subscribers didn’t open more despite receiving more emails, they shouldn’t continue to receive email more frequently.
Remember to control your tests for one variable. You shouldn’t be testing new content in one of the groups that is also receiving email more frequently.
Also be sure to give these tests enough time to run. In the example above, I’d say a month would be sufficient, depending on the frequency. You should probably get 10 emails out to your control group before making solid inferences.
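The group split described above is easy to sketch in code. Here’s a minimal example that keeps most of the list on the current cadence (group A) and carves out two smaller test groups — the list contents, the 10 percent proportions, and the function name are illustrative assumptions:

```python
import random

def split_for_frequency_test(subscribers, seed=42):
    """Randomly split a subscriber list into one control and two test groups."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = subscribers[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    cut_b = int(n * 0.10)          # 10% will get fewer emails (2/month)
    cut_c = cut_b + int(n * 0.10)  # next 10% will get more emails (6/month)
    return {
        "B_less_frequent": shuffled[:cut_b],
        "C_more_frequent": shuffled[cut_b:cut_c],
        "A_control": shuffled[cut_c:],  # everyone else: business as usual
    }

groups = split_for_frequency_test([f"user{i}@example.com" for i in range(1000)])
print({k: len(v) for k, v in groups.items()})
# → {'B_less_frequent': 100, 'C_more_frequent': 100, 'A_control': 800}
```

The random shuffle matters: if you split the list alphabetically or by signup date, you bake a hidden variable into the test.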
Method 3: Look and guess
If you’re having deliverability issues, or for whatever reason you can’t wait to hear back from your subscribers on what they’d like to see cadence-wise, all is not lost.
Build some segments from the data you already have. Build a group of subscribers that open 25 percent or less of the emails you send. See what happens if you email that group half as frequently.
Build a group of subscribers that convert often. Try emailing them more often, and see if conversions increase. Take small steps, and be sure to measure the difference.
Ideally, you’ll identify a group (e.g., subscribers who open 25 percent or less of the emails sent in the last year) and, from that group, build a control group and a test group that will get the adjusted frequency. See how the stats compare.
This method is not as good as the first two, but it can help.
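The segment-building step in Method 3 can be sketched in a few lines. This example uses made-up engagement records (sends and opens per subscriber over the last year); the data shape and function name are assumptions for illustration:

```python
def low_engagement_segment(stats, threshold=0.25):
    """Return subscribers whose open rate is at or below the threshold."""
    segment = []
    for email, (sent, opened) in stats.items():
        if sent and opened / sent <= threshold:
            segment.append(email)
    return sorted(segment)

stats = {
    "a@example.com": (40, 2),   # 5% open rate  -> in segment
    "b@example.com": (40, 20),  # 50% open rate -> out
    "c@example.com": (40, 10),  # 25% open rate -> in (at threshold)
}
segment = low_engagement_segment(stats)
print(segment)  # → ['a@example.com', 'c@example.com']

# Split the segment in half: a control group (unchanged cadence) and a
# test group that will receive email half as often.
control, test = segment[::2], segment[1::2]
```

From there, compare opens and conversions between `control` and `test` after the test window closes, exactly as the control/test comparison above describes.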
Don’t treat everyone the same
Your subscribers want to receive a different amount of email from you at different times in their lives and at different times of the year. The more responsive your program is to those desires, the more relevant you become, and the better your campaigns will perform.
If you’re wondering whether you should send email less or more frequently, test it. There’s a good chance your email service provider offers the functionality, and perhaps even the services, to help get you started or manage the process entirely.
“Seamless at email sign” image via Shutterstock.
Each year, we go through the same process, don’t we?
We start the year with a topic, a theme, something that carries us through the Spring. Some of you find the topic interesting, and hire me. Others find the topic interesting, and implement the ideas. Still others find the topic of interest, and continue to subscribe. Finally, there are those who don’t like the topic, and unsubscribe, citing “too many updates”!
Then we get to Memorial Day.
From late May to early September, attention is diverted. Some might think content is to blame, but I beg to differ. No, I think our attention span dwindles, in part, because of the weather.
For the seventh consecutive year, you’ll be part of a time-honored tradition, called “a decrease in content frequency”.
Starting next week, posts will be published for public consumption on Monday morning, Tuesday morning, and Thursday morning. As always, when topics dictate, supplemental posts will be published.
I have more conflicts than a Louisiana politician when it comes to the news of the New Orleans Times-Picayune reducing its frequency from seven to three days a week: I was in charge of digital content in the parent-company division that started its sister site, NOLA.com; I worked on Advance’s Ann Arbor project; I was involved in the early stage of its Michigan project; and I’m working with Advance on another effort — though I am privy to nothing about New Orleans today. So take anything I say with a grain of salt the size of the Gulf of Mexico…. Still, I can’t not comment on the news.
Mathew Ingram and Ken Doctor will take you through the economic reality at work in New Orleans and Advance’s Alabama and Michigan markets: The cost of printing seven days a week is becoming unsustainable. It’s still profitable to print two or three days a week, not because those are the only days when news happens but because newspapers are still in the distribution business and those are the most lucrative — still-lucrative — days to distribute inserted and printed ads.
That could change again when and if (a) newspaper circulation falls below the critical mass needed to distribute coupons and circulars and (b) local advertisers become more savvy and finally move online themselves. Then printing and distributing paper will become even less profitable, even less sustainable. That’s when print could — mind you, I didn’t say “will” as I’m not predicting the form’s demise; I repeat, “could” — disappear.
By then, newspapers had better be ready. That is, they had better have become digital companies. That is the essence of the digital first strategy: become sustainable, successful online companies that can survive without (or with) print. And grow again from there.
That’s the process we’re witnessing here — that and a continuing cutback brought on by falling circulation and advertising revenue; not a new story, of course. This is a most difficult transition.
Guardian editor-in-chief Alan Rusbridger has been talking about this transition for years. Back in 2005, he talked about buying the last presses. Later, he talked about trying to move his newspaper over what he called the green blob — the great unknown that stands between declining print and ascending digital. That is the job of the editor and publisher today: to make that transition. Shifting content, staff, readers, and advertisers from print to digital is necessary. Improving digital is necessary. And rethinking print is necessary.
If profitable, I think there could continue to be a role for print. In the Guardian’s case, I’d propose that it follow the very successful model of Die Zeit in Germany and publish once a week as the Weekend Observer, turning the Guardian into an online-only, worldwide brand, which it pretty much already is. See, I’m not against print.
But we have to make print beside the point. Of course, it’s not the manufacturing and distribution we should care about preserving and advancing. It’s the journalism and service. It’s not the past we want to protect. It’s the future.
You can argue with the strategy undertaken by any newspaper company undergoing this difficult transition. But better a transition than the alternative.
LATER: Postmedia in Canada just announced that it, too, is cutting frequency, ending Sunday papers (which are thin like Saturday papers in the U.S.) in Calgary, Edmonton, and Ottawa. The National Post is suspending Mondays in the summer and looking at its schedule. The company is moving page production to a shared facility in Hamilton, Ont. Disclosure: I’m on the digital advisory board for Postmedia.