We Can Fix a Leaky Digital Branding System in 2014

By Chip Meyer, CEO of Reactx

After years of grumbling, industry-wide complaints about wasted brand spending and missed opportunities for publisher revenue have reached a fever pitch. Earlier in February, Netscape co-founder turned venture capitalist Marc Andreessen let loose on Twitter about how tech vendors continue to drop the ball in digital branding, and how publishers fail to hold ad tech companies to a higher standard. “The issue,” he tweeted, “is that most ad tech is optimizing against the local maxima of price, rather than consumer relevancy.” A chorus of notable voices from both the vendor and publisher sides immediately chimed in to support Andreessen’s sentiments, and there is certainly something to them.

Andreessen’s argument was simple: We see too many poorly targeted ads with poor ad content, which does no one any favors. The core problems he hinted at go fairly deep. For brands and publishers to thrive in a digital setting, we need to see a more advanced alignment of brand ads and publisher content. Tech companies need to take advantage of media that delivers highly relevant, targeted ads across quality content, to make programmatic the valuable digital branding tool its proponents have long claimed it could be. And campaign performance must be measured in a way that reflects how 21st century consumers interact with content and ads, rather than concentrating on click-throughs or assuming every impression is worth the same as any other impression delivered to the same targeted consumer.

It’s a tall order, but that’s what brands and publishers want, regardless of whether or not they know how to ask for all of it right now. They will soon enough, and there are vendors in the market right now who are happy to deliver the goods.

Brand marketers also demand rich content — page take-overs, overlays, skins, peels, IAB Rising Stars, rich video and a variety of other dynamic digital ad creative needed for the depth and rich feel branding necessitates. However, for programmatic to serve the high-impact, custom premium ads that brands need to move the needle online, the industry at large needs to get over the old idea of having to pre-qualify them (i.e. pre-determining that a custom digital ad format will render properly on a specific publisher’s domain or page before it is served).

Pre-qualifying custom ads is holding up the delivery of billions of dollars in high-quality, engaging digital ad content served via Real Time Bidding (RTB). The process of pre-qualifying digital ads is fraught with costly RFPs and custom one-off creative development for individual publisher domains and pages. It effectively neuters the speed and scale advantages of programmatic and impression-level targeting, turning the delivery of custom ads into a time-consuming, clumsy and expensive process that is not adaptive to cross-device designs and auto-customization of creative and brand messaging. Pre-qualifying ads is a poor fit for the highly competitive, real-time digital environment. Eliminating pre-qualification for custom ads would save brand marketers and publishers nearly $5 billion in what is now simply wasted spending — money they should be spending with publishers instead of on coders.

Let’s look at measurement. Andreessen hinted at something very important here: Not all impressions are created equal, even when they’re delivered to the same consumer. What matters is how the consumer engages with the ad, and the content around the ad, where it’s served. To that end, we need to look at new metrics for ad engagement, like time spent with the ad. If the consumer spends more time with the content around an ad, that’s a more valuable placement than one where the consumer quickly clicks away. Understanding time spent with the ad can be a great boon for publishers to entice brand marketers to shift their spending from television to digital. For marketers, taking advantage of metrics around time spent with the ad is simply good branding.
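To make the “time spent with the ad” idea concrete, here is a minimal sketch of a time-in-view metric. The event data and placement names below are hypothetical illustrations; a real system would consume the output of a viewability or engagement tracker.

```python
from collections import defaultdict

# Hypothetical engagement events: (placement_id, seconds the ad was in view).
events = [
    ("homepage_takeover", 12.0),
    ("homepage_takeover", 8.5),
    ("sidebar_banner", 1.2),
    ("sidebar_banner", 0.8),
    ("sidebar_banner", 1.5),
]

def average_time_in_view(events):
    """Average seconds in view per placement - one possible 'time spent' metric."""
    totals, counts = defaultdict(float), defaultdict(int)
    for placement, seconds in events:
        totals[placement] += seconds
        counts[placement] += 1
    return {p: totals[p] / counts[p] for p in totals}

print(average_time_in_view(events))
```

On this toy data the takeover placement averages over 10 seconds in view against roughly a second for the banner, which is exactly the kind of gap a flat per-impression price would hide.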

The seeds have been planted for an industry-wide move toward more premium, high impact (relevant) ad placements and metrics. In the coming year, an increasing number of solutions will be suggested for the problems of garbage ads and wasted spending. But for change to happen, brands and publishers need to understand what smart branding and smart measurement mean today. Whatever happens, buyers and sellers alike will have to adjust their mentality — and the tools they use — to account for the value of branding in a digital environment.

10 Reasons Why Your Agency Should Offer Optimisation

By Dan Glazer, Head of Partnerships at website optimisation software company Optimizely 

The marketing game is changing. Today’s marketers are more data-driven. They’re obsessed with results and ROI. And they’re laser-focused on the one thing that helps them deliver both – technology. By 2017, it’s predicted that CMOs will spend more on IT than their CIO counterparts.

For digital agencies, this burgeoning landscape of data-driven marketers presents an exciting challenge. New technologies and best practices are emerging every day. How does a digital agency looking to be best in class in its industry manage a finite budget and deliver impactful results?

Answer: Conversion Rate Optimisation (CRO).

Whether you specialise in designing creative, building websites, or managing ad spend, investing in a CRO skill set will help you grow your agency… and your clients’ business. Here’s how.

1. Lead the charge.

Ten years ago, most marketers didn’t know what SEM was. These days, SEM is not only part of the everyday marketing vernacular; it’s an integral component of any online marketing strategy. That didn’t happen overnight. Recognised leaders in the space started optimising for search in the mid-nineties, establishing best practices and a framework for others to follow. By the time the rest of the world hopped aboard the SEM train, these companies had already established themselves as thought leaders and trusted advisors. CRO adoption is happening even faster than SEM did, but it’s still early in the game. Who will be there to build out an optimisation strategy and coach brands through their new testing playbook?

Enter you (and your agency). Don’t stand on the sidelines as your competitor transforms the way your clients do business. Be the agency that introduces your client to their next home run.

2. Turn one-off client projects into long-term engagements.

Unlike building a website or creating a new ad spot, CRO isn’t a finite deliverable. Continuous testing leads to incremental growth, test after test, and can strengthen each one of your clients’ digital marketing efforts.

Building them a new website? Don’t just “set it and forget it.” Lengthen the engagement by monitoring and improving the site over time. Setting up a new media buy? Don’t let it run wild. Deliver long-term results by testing various landing pages and reallocating traffic to the winning page as results come in. Unless you’re converting 100% of visitors today (and if so, please let me hire you!), there’s always room for improvement and an opportunity to deliver better results.

3. Settle debates with hard numbers.

Testing is one of the first steps toward becoming a more data-driven organisation, which is especially vital to a successful client-agency relationship. Have you ever disagreed with a client over what a headline should say or which image to use? A/B testing allows you to settle that argument objectively and quickly, and move on. No time wasted on back-and-forth creative treatments, no tiptoeing around big egos, just empirical evidence. In the end, the best experience wins.
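As an illustration of settling a debate with hard numbers, here is a minimal sketch of a two-proportion z-test on invented headline data. The visitor and conversion counts are hypothetical, and in practice a testing platform’s built-in statistics would do this for you.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: headline A vs headline B, 10,000 visitors each.
z = two_proportion_z(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000)

# |z| > 1.96 means the difference is significant at the 95% level,
# so headline B's 3.6% rate beats A's 3.0% with statistical backing.
print(z > 1.96)
```

The point is not the formula but the workflow: once both headlines have run against live traffic, the argument is over.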

4. Use data as preventative care.

With CRO, you can easily monitor site performance and prevent problems for your clients before they arise. Imagine you hire an agency to redesign your website. You spend months wireframing with the agency, discussing the brand, rethinking the vision, and developing the designs. Then, on the day the site goes live, the unthinkable happens – revenue drops. Traffic is constant, but the conversion rate is lower. That’s not exactly the start of a beautiful friendship.

Testing helps ensure the new redesign goes well – an outcome just as important to your agency as to the CMO who hired you.

5. Use data as insurance.

Testing not only increases your chances of success, it also keeps you in the game, even when results from tests don’t turn out as expected. Back to the failed website redesign example… Investing so much time, money and other valuable resources in a project that turns out to be a complete flop is a scenario testing helps you avoid. Had you been running A/B tests on the new design, you would have been able to look back and gain a much deeper understanding of what went wrong. Where did visitors drop off in the funnel? What types of users were more or less likely to convert? What parts of the site did they engage with or not engage with?

Data and analysis gained from testing provide a 360-degree view of website performance. They help you explain why you did what you did and make the case for continued, more informed iterations. The more data behind your decisions, the more you and your client win.

6. Improve efficiency.

Testing saves you and your client time. Using data to back up decisions prevents unnecessary back-and-forths and helps you execute more quickly, ensuring you’re always delivering the most value per billable hour. Testing also saves money, ensuring efficient use of capital by preventing significant investment into projects (like that website redesign) destined to fail. Speaking of saving money, testing also helps you make more of it.

7. Turn more ad clicks into conversions.

If your clients are spending on ads, you should be doing everything in your power to maximise the chance of conversion – that’s where CRO comes in. Testing the pages you send paid traffic to is a proven way to improve conversions and increase your ROI from SEM. Liftopia, for example, increased revenue from SEM traffic by 23%, simply by optimising its landing pages. Focusing solely on the top of the funnel is inefficient if you don’t improve the bottom as well.

8. Open a brand new revenue source for your agency and your clients.

Adding CRO to your services offering creates a brand new revenue source for your agency and lets you increase conversions for your clients. Increased conversions mean more revenue for your clients, and more revenue for your clients makes your service more valuable. The more value you create, the more likely you are to win deep follow-up engagements with existing and new clients. A win for you and your client.

9. Increase recurring revenue for you and your clients.

We’ve said that increasing website conversion rates is a practice that increases revenue. The noteworthy part about this type of revenue is that it’s recurring. Unlike online advertising, where each pound spent only brings as many visitors as you pay for, increasing conversion rates brings in recurring revenue gains that last even after the test stops.

Let’s say your team increases average revenue per visitor (ARPV) for your client by 17%. The new ARPV becomes the new status quo. Since CRO is an ongoing practice, you can constantly uncover further conversion increases that grow the stream of recurring revenue for your clients. Do your job well, and you might guarantee yourself a contract to run CRO for your clients year after year.
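A toy calculation, using invented figures, shows how each win resets the status quo and successive wins compound:

```python
# Hypothetical figures: a baseline ARPV and the lift from each successive
# optimisation win over a year of ongoing testing.
baseline_arpv = 2.00          # pounds per visitor
lifts = [0.17, 0.08, 0.05]    # each winning test's relative lift

arpv = baseline_arpv
for lift in lifts:
    arpv *= 1 + lift  # the improved rate persists after the test stops

print(f"ARPV grew from £{baseline_arpv:.2f} to £{arpv:.2f}")
```

Unlike an ad budget, nothing here needs to be re-spent to keep the gain: the higher conversion rate keeps paying on every future visitor.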

10. Deliver massive ROI.

Pound for pound, website optimisation offers the highest ROI of any marketing activity. By optimising visitors’ experience of your clients’ brands, you can dramatically improve ROI for your client and enhance the credibility of your agency.


Do you know what you’re working with?

By Steve Denner, co-founder and director of Adestra

Technology is well and truly in the hands of the masses – almost 80% of UK consumers own a smartphone. I recently read that in 1991 the cost of buying all the iPhone’s components would have been around £2 million – but are we really making the most of all the computing power now available at our fingertips?

Remaining up to date with the latest technology can be difficult. As soon as you’ve got to grips with the capabilities of the latest tablet or games console, the next model is already being prepared for release. This game of cat and mouse also applies to marketers, who need to maximise the technologies available to them, meeting (and even predicting) customers’ needs.

But as businesses and consumers rush ahead, are we missing out on anything, from handy user shortcuts to deep-rooted capabilities? The promise of automation carries the same risk for marketers looking to simplify their strategies at speed. Once businesses have invested in marketing automation programmes, they need to ask: are we getting the most out of the increased functionality at our disposal?

There’s a danger that business leaders could adopt an attitude along the lines of: “we might not use all the functionality we’ve invested in just now, but it’s good to know we’ve got it”. This increasingly common sentiment suggests that marketers aren’t making the most of the functions they’ve invested in, and are wasting money in the process. This could be the result of a lack of knowledge about how to use the technology – or an overestimation of what the company actually needs. Both represent a waste of resources for marketers.

I don’t mean to suggest that marketers are at fault here, but simply that technology is evolving so rapidly that there isn’t always sufficient time to adequately train marketers to maximise the potential of new technologies. In their bid to keep abreast of consumer expectations and technological developments, businesses might also be investing in automation systems without accurately evaluating their brand’s actual requirements. Without understanding basic brand needs and objectives, technology will never be able to fulfil them.

To make the most of any automation investment, marketers must make sure they have the right people in place to implement and maintain their systems. If not, the next stage is to skill-up, skill-shift or recruit new talent into the workforce. These are implications that technology vendors should be able to advise and help clients with. It’s risky and, quite frankly, inaccurate to assume that one person alone is capable of developing successful and profitable large-scale automated marketing campaigns – even with the help of powerful technology.

One way to ensure marketers fully understand and capitalise on the potential of a specific marketing automation system is to work with a partner who can offer valuable ongoing support and guidance. The benefits of working with third-party customer service experts include real-time support, training, project management and access to specialist designers. On-site and third-party expertise provides much needed backup for automated campaigns.

If marketers are armed with powerful technology, given expert insight and training and have access to tech support, they have a far better chance of winning the battle to drive customer loyalty, and unleashing the full potential of automated marketing strategies and campaigns.

Open Source: From Great Technology to Greater Intelligence

By Leon Ward, Product Manager, Advanced Malware Protection: Network, Cisco Security Group

The financial services industry has embraced the adoption and use of open source software. According to software and consulting firm Black Duck, up to 75% of the code supporting a UK investment bank’s trading application is commonly based on free and open source software, while only 18% of the code is proprietary. Analysts say that adoption in financial services is poised to increase further as cost pressures grow.

In turning to open source, the financial services sector is following a path trodden by other regulated industries – healthcare and government IT, for example – which are attracted to open software development models by promises of cost control and increased innovation.

The origins of Open Source can be traced back to the software developer community that evolved around the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology (MIT) during the 1960s and 1970s. In those early days, all software was shared freely amongst the academics and enthusiasts who wanted to build great software to address new challenges.  As technology adoption spread in the 90s, interest in the ‘open’ approach continued to grow as users also recognised the value side of the equation. Not only were they gaining access to software that had the benefit of a community of engaged and interested minds working together to continuously improve it, but open source saved costs by opening the market for support and maintenance of the code. As corporate networks expanded another benefit emerged. Open source enabled agility.  Organisations could more easily integrate complementary applications and services into their environments to respond to new business imperatives and expand capabilities for their users.

More recently, in the context of cybersecurity, open source is a very effective way to solve complex problems because it creates real collaboration and trust between vendors and the experts that are tasked with addressing advanced and aggressive IT security threats.

Modern corporate networks extend beyond the traditional perimeter to include data centres, endpoints, virtual, mobile and the cloud. These networks and their components constantly evolve and spawn new attack vectors including: mobile devices, web-enabled and mobile applications, hypervisors, social media, web browsers and home computers. Attackers are taking advantage of gaps in protection to accomplish their mission. They also go to great lengths to remain undetected, using technologies and methods that result in nearly imperceptible indicators of compromise.

Open source is a valuable tool for defenders as they work to close these gaps and to gather greater intelligence about potential threats to make better decisions and take action. Let’s take a closer look at the role of open source in these two areas.

Closing security gaps. Reducing the attack surface is essential as organisations strive to protect against the latest sophisticated threats. Waiting for updates from vendors to close vulnerabilities isn’t realistic when high-value assets are at stake and attacks are relentless. For organisations creating their own custom applications, the ability to detect and protect these applications is even more challenging. An open approach can help organisations close security gaps faster with the ability to create protections on their own or apply shared best practices and tools.

Gaining greater intelligence. To deal with dynamic environments organisations need access to global intelligence, with the right context, to identify vulnerabilities and take immediate action. An open architecture facilitates the sharing of real-time threat intelligence and protections across a vast community of users for collective immunity. It also streamlines integration with other layers of security defences added as IT environments and business requirements change, thus enabling more effective, coordinated protection.

In the realm of technology, open source has a long history and its applications and benefits will continue to evolve and grow. The findings of the 2013 Future of Open Source Survey state that enterprises across the board increasingly see open source as leading innovation, delivering higher quality and driving business growth. Based on the tenets of community, collaboration and trust, it is an approach that delivers stronger solutions, addresses complex problems and demonstrates technical excellence, innovation and dependability.

LUSH uses Data to Improve In-Store Performance

By Claire West, Fresh Business Thinking 

Lush has made data analytics available to staff on the shop floor as well as in its warehouses, so they have real-time information on sales performance at their fingertips.

Not only has this helped to tap into the ambitious spirit of staff – competing over which store can do best in terms of sales and performance – but it also gives them information to make decisions and better the customer experience.

For example, if they notice a particular bath bomb is selling well with a certain shampoo, they can change the store layout so the items are closer together.
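Spotting products that sell together can be sketched as a simple pair count over till data. The baskets and product names below are hypothetical; Lush’s actual analysis runs through its QlikView platform.

```python
from collections import Counter
from itertools import combinations

# Hypothetical till data: each basket is the set of products in one sale.
baskets = [
    {"butterball_bath_bomb", "big_shampoo"},
    {"butterball_bath_bomb", "big_shampoo", "soap"},
    {"big_shampoo", "soap"},
    {"butterball_bath_bomb", "big_shampoo"},
]

# Count how often each pair of products appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pair is a candidate for adjacent shelf placement.
top_pair, count = pair_counts.most_common(1)[0]
print(top_pair, count)
```

Even a count this crude is enough to justify moving two products next to each other and then checking whether the combined sales rate improves.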

Because all its products are created from fresh ingredients, the retailer has been able to save £1 million by making the most effective use of its produce – ordering exactly what it needs to create the right amount of goods for the levels being sold, and ensuring no products go out of date.

The QlikView Business Discovery technology platform is used by employees at every level of the business, providing access to relevant sales, stock, store and staff information to improve performance.

Scott Silverthorn, Head of Data Services at LUSH Fresh Handmade Cosmetics, said: “As it is used by many different parts of the business, QlikView allows us to continually gain insights into our company. Some of the shop managers have told us they have had their most profitable year ever because QlikView has brought together the data they need to manage their sales, their stock and their staffing.”

Testing Should not be Testing – 4 Ways to get the Most from your Optimisation Team

By Matt Althauser, European GM of Optimizely

“Curiosity killed the cat” has become a cliché – a common warning about the dangers of unnecessary experimentation and the perils of the unknown. Its rejoinder is less well known, though: the full phrase runs, “Curiosity killed the cat, but satisfaction brought it back.” For business leaders and marketers looking to improve their websites’ conversion rates, the extended phrase holds true.


London’s Wearable Tech Show 2014

By Rupert Cook, Business Development Director at Gekko

Last week saw London’s Olympia host the UK’s first ever Wearable Technology Conference and Expo dedicated to showcasing the latest developments in smartwatches, wristbands and other wearable devices. With speakers from Microsoft, Google, Samsung and Intel, the show promised a lot, but did it live up to the hype?


Fighting Virtual Shadows to Protect Customer Data from Malicious Intent

Wieland Alge, Vice President and General Manager EMEA, Barracuda Networks

The plight of Barclays Bank, following the theft of thousands of confidential customer files, has once again thrust the issue of how organisations protect confidential data high up the business and consumer agenda. Accountable heads are lifting from the global sands of ignorance as theoretical threats become real-life scenarios that must be dealt with, or they will expose data vulnerabilities that could see the downfall of even the most powerful brands.


A New Dawn for the Internet?

Sam Silverwood-Cope of Intelligent Positioning

“This massive expansion (of Top Level Domains) represents one of the greatest changes to the Internet since its inception,” said Akram Atallah, President of ICANN’s Global Domains Division, the group behind the release of multiple new Top Level Domains. The expansion aims to give brands more choice and greater freedom, and to open up an already saturated market of highly sought-after domains.
