For Retailers, “Insourcing” Adtech is the New Black

By Declan Kennedy, CEO StitcherAds

Insourcing is not a brand new concept. But it’s taken a long time – too long – to reach the part of the enterprise most critical to its success: the part that acquires new customers and grows revenue. Fortunately that’s now changing.

Not so long ago, outsourcing was all the rage as a way to reduce costs. More recently, though, enterprises needing to respond quickly and change frequently – tying together information from many silos of the business – have found it more effective and more secure to bring what were once exotic specialities in-house rather than leaving them to a third party.

For example, we saw UK retailer Marks & Spencer move its e-commerce systems from Amazon Web Services back in-house. Now that appropriate, more mature applications are available to support this, many businesses are taking an intelligent and crucial step towards ensuring ROI on processes like website development, SEO and, most recently, marketing automation. Savvy digital marketers will anticipate that, with the right tools, they can get the results they need without losing crucial time communicating with third parties.

This movement has been brought about by the collision of two big trends: the SaaS revolution, which has completely changed expectations of software for enterprises and SMBs alike; and Big Data, where the variety, velocity and volume of information – and in particular of customer interactions – can no longer be left to others. No longer are radically changing and arbitrary costs the norm for a service; instead, organisations can bring processes in-house with the support of an effective SaaS platform, for a steady and fixed price. This shift has been almost universal within business, with but one exception, held back by its traditional moorings – adtech. But this cannot last.

For retailers in particular – where the skills of using direct response advertising to capture opportunities in near-real-time, and of winning new customers by rapidly shifting offers to reflect changing demand, are crucial – the adoption of effective marketing automation tools is paramount. A recent eMarketer report shows US retail digital ad spending will reach $11.05 billion in 2014, with a roughly 70% emphasis on ads meant to trigger sales and leads rather than on those intended to boost brand awareness. With increasing amounts being spent on direct response, the most value will go to the retailers who make every penny count.

Agencies still have a role to play – their strategic insights are often extremely valuable – but creating a large volume of advertisements rapidly isn’t something that makes sense to price as a percentage of spend. Businesses are starting to realise that Facebook has become an incredibly powerful channel for direct response advertising, and that with the right technology they can maximise sales conversion rates whilst keeping costs low and predictable. Adtech is by no means as mature as it should be, but in-house marketing teams, vendors and agencies alike are starting to wake up to its possibilities.

Technology and Culture: A Chicken and Egg Debate

By Joseph Do, CEO MindLink

Embracing digital technology is not just a business necessity, but also a smart cultural move. The evolution of technology has invariably prompted cultural change within organisations across the globe, but should it act as the primary driver for it?

Whilst technology can act as a trigger, it is an organisation’s culture that has the power to nurture forward-thinking tech adoption and ultimately improve long-term business growth.

The global proliferation of technology that bridges the gap between social and enterprise means organisations are being forced to evolve to meet the changing technological needs of their workforce.

For example, the Bring Your Own Device (BYOD) trend is making a whole host of additional communication tools and technologies available in the workplace. Research shows that by 2016, 1.62 billion mobile devices will be in daily use in the workplace. It is because of findings like this that organisations need to be open to change and willing to embrace the forward-thinking nature of their employees and the culture of the business.

It is down to business leaders to take an open approach to new technology and consider how it will impact the people within their organisation. Just because a company has a progressive corporate culture doesn’t necessarily mean employees understand how best to utilise digital tools. People have to want to learn if they are to progress, so training remains essential if culture and technology are going to work together and have a real impact on company performance and revenue.

The spread of collaboration tools has cemented online communication as a completely new mode of operation for employees. Businesses should be looking to invest in smart collaboration tools instead of viewing them as an expense.

So first things first, what are the key components of a tech-friendly corporate culture?

Support

Without sufficient investment in staff development and training, technology will never reach its full potential within a business. Providing the right level of technical support and offering training to employees shows a company’s commitment and sense of support. It is important to go beyond just making tools available to a workforce and hoping people will naturally adopt them.

Open

The future success of a business relies on being open to new ideas and embracing new processes and structures rather than being resistant to them. It is up to business leaders to have an outlook that not only accepts, but also understands, how technology can benefit their business. No good will come from holding onto antiquated communication, leadership, management or sales structures in the hyper-connected, modern business environment.

Trust

As with most relationships, trust is key. Businesses must trust their employees if they want to capitalise on the potential of a fully mobile, collaborative workforce. Many people would agree that you can’t fight the BYOD culture (and in effect any digital culture) – and why should businesses want to? This ‘always on’ way of working is an advantage for an organisation, provided precise guidelines for usage and compliance are laid out and sufficient security policies are in place. Establishing communication and knowledge-sharing platforms will enable staff to work more efficiently and in a more connected way, whilst also helping businesses to benchmark the productivity of employees no matter where they are.

Collaborative

People within the organisation need to be open about sharing knowledge and insight to benefit the wider company rather than just their own personal development. This means the mentality of a workforce is vital to maximising any new technology, especially when it comes to collaboration tools and software. A collaborative culture is needed before any new technologies are introduced, and everyone needs to be working towards the same goals. If this collective attitude is commonplace within an organisation, then there will be a natural fit when collaborative technologies are introduced.

There is no question that technology impacts working practices, but a receptive and open corporate culture must exist for technology to have a positive effect on business processes – rather than a disruptive or, worse still, damaging impact on performance.

Cultivating the right attitude at every level of your business is the best place to start building the solid foundations that will support technological innovation. There should be a symbiotic relationship between technology and corporate culture, as neither technology nor culture can succeed in improving a business’s performance in isolation.

The Holy Grail – BYOD, COPE or CYOD

Choosing the right enterprise mobility strategy for your business

By Alessandro Porro, VP of international sales, Ipswitch

Finding the ideal alignment and balance between hardware, software and employee preference has become the holy grail for those tasked with defining an enterprise mobility strategy. BYOD delivered many great things, such as higher employee productivity and satisfaction. It also made IT managers rethink their strategies to make technology work for their organisation in terms of mobility, security and management. Then COPE (Corporate Owned, Personally Enabled) came along, which promised to solve some of the problems that BYOD didn’t, such as security. However, COPE also posed challenges and is now being followed by CYOD (Choose Your Own Device).

With so many acronyms flying about, it might seem hard to know where to start in identifying the best solution. However, it would seem 2014 has heralded the end for BYOD, with a recent report by analyst firm Gartner declaring its demise, stating: “There is no way for IT to assume full responsibility of securing and managing devices without ownership”. Indeed, the acronym is now being translated by some as “Bring Your Own Disaster”, suggesting it would perhaps be wise to learn from others’ mistakes.

The COPE model allows employees the choice of a selection of ‘company approved’ devices instead of using their personally owned device for work. This idea might sound like the days of the corporate Blackberry chosen by the IT department, but the ‘Personally Enabled’ part is where the IT department has released some control.

The COPE model solves some of the security concerns that BYOD generated, making it easier for IT managers to monitor and protect the devices, whilst still embracing the Consumerisation of IT by allowing users an element of choice. However, COPE is not without its challenges, which is where CYOD has emerged, offering an apparent ‘happy medium’.

A variation on COPE, CYOD lets employees choose from a limited selection of approved, corporate-liable device models with the levels of security and control that IT needs. The slight difference is that the employee pays for the upfront cost of the hardware while the business owns the SIM and contract for greater visibility, control and potentially lower costs.

Protecting your company’s data

When a company adopts a COPE strategy, supplying employees with ‘company approved’ devices, it is easier for the IT manager to ensure the protection of corporate data. As the company owns the devices, the IT manager can easily decide which data employees can and cannot access, make regular backups, and remotely wipe devices in case they get stolen or lost. Also, when employees have technical problems, those problems can be solved in-company, instead of at an external – and possibly dodgy – IT repair shop. All these measures reduce the risk of data falling into the wrong hands.

Privacy issues

The principal challenge with COPE, though, is privacy. IT managers must think critically about the consequences of their data protection policies. As the boundaries between work and private life fade, it will be hard to force employees to use their mobile device for business purposes only. But when employees use their devices privately as well, how far can the IT department go in controlling those devices without undermining the privacy of employees? For example, is it still acceptable to wipe devices remotely if they contain private data, such as family photos?

Monitoring network traffic

Moreover, with COPE, as with BYOD, IT managers are challenged by the introduction of multiple devices onto their wireless networks. As wireless becomes the primary user network, it needs to deliver the availability and performance that employees expect from the wired network. BYOD increased network density, bandwidth consumption and security risks. These issues are reduced when IT managers opt for a COPE strategy, because the IT manager will recognise most of the devices on the network. He or she will be able to track users, their devices and their usage habits in order to resolve any issues that could impact wireless availability and performance.

Another major plus point of CYOD is that IT can focus on supporting a limited number of platforms and devices, rather than trying to support as many as possible.

Application control

But when implementing a CYOD scheme, organisations need to look at application control and whether CYOD should permit employees to run non-business-related applications. This is a discussion in itself when you start to look into controlling employees’ personal social media apps on corporate-owned devices. Certainly, many would argue that there needs to be a shift in focus away from standard MDM solutions and towards managing data and security at the app level.

A growing number of companies are opting for MAM (mobile application management) instead of MDM (mobile device management), since it enables IT to protect enterprise apps and corporate data throughout the mobile application lifecycle, from deployment to app signing to inspection for security flaws and malware.

Analyst firm Yankee Group predicts that the enterprise mobility market will consolidate, as organisations broaden their requirement for enterprise app development, on-premise and cloud-based deployment, app and device management, and security – all delivered by a single platform vendor. This would help organisations to achieve a holistic view of their enterprise thus enabling the management of devices, users, data and applications as well as delivery of cloud and on-premise deployments. In this vision the device an employee chooses to use becomes less critical as the focus shifts from the device to the app.

Making the call

What the evolution of BYOD to COPE to CYOD does best, for those struggling to decide which strategy makes most sense for their enterprise, is illustrate how fast things change. In turn that signposts a requirement to really look ahead and consider future needs and demands so that whatever strategy is deployed can be advanced and built upon with ease. This would involve consideration, at the outset, of solutions that can enable secure mobility, device choice, data consistency and agile management – that is where you should start… good luck!

Count To Five And Keep Advanced Threats At Bay

By Sean Newman, Security Evangelist for Sourcefire, now part of Cisco

As business environments change, security infrastructure must change to enable business success. Whether you’re operating under increased risk from advanced targeted attacks, or transitioning to the cloud or mobile devices for the productivity, agility and efficiency these technologies provide, the end result is the same: you need to adapt your security infrastructure in lock-step. You can’t afford to leave gaps in protection for today’s sophisticated attackers to exploit.

However, finding the resources to address the evolving cyber security landscape effectively can be challenging. Today’s attacks are stealthier than ever. To understand and protect against them, organisations need to mobilise all aspects of their defences to focus on the threat, including services. It’s about gaining visibility and control across the extended network and the full attack continuum – before an attack happens, during the time it is in progress, and even after an attack may have been successful, with information stolen or systems damaged. This new threat-centric model is driving changes in cyber security technologies, products and services alike.

The first wave of managed security service providers (MSSPs) focused on getting products and tools up and running, maintenance, upgrades, and training. But today, effective cyber security services need to be based on an in-depth and continuously evolving knowledge of the threats themselves, not just the operations of the technology. Reflective of a new era in how we must address cyber security, some industry analysts are starting to call this next wave of security services MSSP 2.0.

Based on in-house security skills, budget and competing business priorities, you may choose to outsource more or less of your cyber security needs. Wherever you fall on the outsourcing spectrum, when evaluating managed security services, the following five questions can help ensure you get the support you need to stay focused on the threat:

1. What types of telemetry form the basis for your visibility and detection capabilities? 

If the answer is simply flow or log data, that isn’t enough. Other data, such as protocol metadata (i.e., data extracted directly from packets traversing the network), is a rich source of insight into today’s more popular attack methods, like ‘watering hole’ attacks and phishing campaigns that contain links to malicious sites. In these cases, the ability to incorporate HTTP metadata in a telemetry model provides the depth of information needed to help detect web-based threats. The more data available, the more effective the MSSP will be at zeroing in on anomalies – a key capability for finding the needle in the haystack.
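To make the flow-versus-metadata distinction concrete, here is a minimal, hypothetical sketch – not any vendor’s API – contrasting what a flow record can tell a detection function with what HTTP protocol metadata adds. The field names and heuristic are illustrative assumptions only.

```python
# A flow record only says that bytes moved between two endpoints.
flow_record = {
    "src": "10.0.0.12", "dst": "203.0.113.7",
    "port": 80, "bytes": 4096,
}

# HTTP metadata extracted from the packets adds the detail needed to
# spot phishing lures and watering-hole redirects.
http_metadata = {
    **flow_record,
    "host": "login-update.example-bank.com",   # hypothetical lure domain
    "uri": "/session/verify",
    "user_agent": "Mozilla/5.0",
}

SUSPICIOUS_TOKENS = ("login", "verify", "update")  # toy heuristic only

def looks_suspicious(event: dict) -> bool:
    """Flag events whose host/URI carry phishing-style tokens.
    Only possible when HTTP metadata is present in the telemetry."""
    text = (event.get("host", "") + event.get("uri", "")).lower()
    return any(tok in text for tok in SUSPICIOUS_TOKENS)

print(looks_suspicious(flow_record))    # False - flow data alone can't tell
print(looks_suspicious(http_metadata))  # True - metadata exposes the lure
```

The same connection is invisible to the heuristic at the flow level and obvious at the metadata level, which is the article’s point about telemetry depth.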

2. How are you performing analytics on that data? 

With the inspection of more data, simple analytics models such as correlating logs against common rule-sets fall short, particularly if they do not function in real-time. Advanced, real-time, big data analytics techniques are essential to scrutinise the large amounts of data gathered, not just locally across the enterprise, but globally through community-based threat intelligence. This level of analysis isn’t based on rules that attackers can understand and hence evade, but is predictive and uses dynamic statistical modelling to identify anomalous behaviours from granular, customer network baselines and other indications of compromise (IoCs) to pinpoint likely malicious activities. Regardless of the number of telemetry sources used, applying robust analytics to data, rather than simple correlation, will result in high-fidelity detections.  
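The baseline-driven approach described above can be sketched in a few lines. This is a deliberately simplified illustration of the idea – flagging deviations from a network’s own learned baseline rather than matching static rules – not a production detection model; the traffic counts and the 3-sigma threshold are assumptions.

```python
from statistics import mean, stdev

def is_anomalous(baseline: list, observed: float, threshold: float = 3.0) -> bool:
    """Flag an observation more than `threshold` standard deviations
    away from this network's own historical baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# Hourly outbound-connection counts for one host (the learned baseline).
baseline = [110, 95, 102, 99, 108, 101, 97, 105]

print(is_anomalous(baseline, 104))   # ordinary traffic -> False
print(is_anomalous(baseline, 950))   # possible exfiltration -> True
```

Because the threshold adapts to each customer’s own traffic, there is no fixed rule for an attacker to study and evade – the property the article attributes to statistical modelling over simple correlation.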

3. Where do you keep that data and how do you protect it?

You’ll need to understand if the data is held onsite, at the MSSP’s data centre, or in the cloud. Depending on the type of data your organisation has, the compliance requirements you face, and the guarantees the MSSP provides, you’ll need to decide if the answer is adequate and, if not, can they offer an alternative approach. This is an individual choice, for each organisation, and must be based on the comfort level of all parties affected from the technical, legal, and business sides of the organisation.

4. What do you report on?

Data is great, but you must be able to understand and act on it. You need a level of assurance that the data is correlated to provide context, so that the information you’re getting is relevant to your environment and has been prioritised. In this way you can focus on the threats that matter most. Time is of the essence when dealing with advanced targeted attacks that have a specific mission. Understand if the MSSP is able to present you with only vetted, high-fidelity, information, versus an endless list of events that require further analysis and investigation to determine whether they are true or false alerts.

5. How can you help protect my organisation against unknown, zero-day attacks?

To detect and protect against zero-day threats you need to be able to go beyond traditional point-in-time approaches, with capabilities that let you monitor and analyse on an ongoing basis, across your extended network. That’s where the value of diverse telemetry, coupled with predictive analytics and statistical modelling, really becomes apparent. This moves beyond the mere event correlation that MSSPs have offered for years. In combination, these capabilities can pinpoint nearly imperceptible IoCs and anomalies to help identify these particularly stealthy and damaging attacks.

Given today’s business, regulatory and cyber security challenges, more and more organisations are looking for outside expert help to protect their environments from cyber attacks. By asking these key questions, you can help ensure your MSSP stays focused on the threats themselves in order to deliver the protection you need.

Mobility Apps: What’s Next?

By Ramesh Loganathan, VP Products at Progress

If Andy Warhol’s famous remarks about everyone having their ‘fifteen minutes of fame’ are to be believed, then how are we to view some of the more recent trends in mobile technology? The time spent in the spotlight by once-innovative concepts such as BYOD, remote working and citizen developers is beginning to ebb away as the hype is replaced by newer, more pragmatic and meaningful technologies. All these ideas, although considered new and exciting at the time, are now fully entrenched in the workplace and considered part and parcel of everyday working life.

What is clear is that the adoption of mobility in enterprise applications is gradually becoming mainstream. One of the main reasons for this is simplified access to enterprise business services and data through the cloud, which, in turn, is resulting in some innovative new application models. It’s a trend that is starting to emerge and will gain full steam later this year, and the following five themes hint at what we can expect to see.

1.     Increased ‘on the go’ productivity with larger data sets

Although the ability to be productive and work ‘on the go’ is hardly new, it’s worth bearing in mind that, to date, this has only seen widespread adoption with relatively small data sets. Increasingly, we’ll see wider adoption of applications that rely on larger, more complex data sets, giving users the opportunity to work with relevant business solutions without any constraints on the functionality or data accessible from mobile devices.

2.     Mobile apps serving a user’s purpose, accessing multiple back-end solutions

It’s easy to think of data as a single entity, stored in one single location, which can be easily accessed. The truth, however, is very different: quite often, employees find themselves having to switch between multiple data sets, stored in a variety of locations and accessed via a diverse range of applications. If mobile productivity is to become truly effective for those reliant on these divergent data streams, it becomes necessary to access many of these sources when building a mobile application. In enterprise use cases, that access may be more involved, with the mobile app needing its own backend that mediates and aggregates access to these multiple enterprise back-end solutions. By integrating multiple back-ends, and providing the ability to use them all through one single point of entry on a mobile device, businesses will be able to provide mobile productivity at a level never previously possible.
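The mediating-backend pattern described above can be sketched simply: one endpoint the mobile app calls, which fans out to several enterprise systems and returns a single, mobile-friendly payload. Everything here is a hypothetical stand-in – the function names, the systems (CRM, ERP, helpdesk) and the returned data are illustrative assumptions, not a real API.

```python
def fetch_crm_profile(user_id: str) -> dict:
    return {"name": "A. Example", "tier": "gold"}    # stand-in for a CRM call

def fetch_open_orders(user_id: str) -> list:
    return [{"order": "1001", "status": "shipped"}]  # stand-in for an ERP call

def fetch_support_tickets(user_id: str) -> list:
    return [{"ticket": "T-42", "state": "open"}]     # stand-in for a helpdesk call

def dashboard(user_id: str) -> dict:
    """Single point of entry for the mobile app; the backend does the
    aggregation so the device makes one request instead of three."""
    return {
        "profile": fetch_crm_profile(user_id),
        "orders": fetch_open_orders(user_id),
        "tickets": fetch_support_tickets(user_id),
    }

payload = dashboard("u123")
print(sorted(payload))  # ['orders', 'profile', 'tickets']
```

The design point is that the aggregation lives server-side: the device stays simple and battery-friendly, while the backend absorbs the complexity of talking to each enterprise system.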

3.     Mobile first user-centric applications

With core business processing and departmental IT solutions already in place in most enterprises, the next set of solutions being considered is primarily around user productivity: user-centred applications that support an individual’s work, which often needs access to multiple enterprise systems. What’s new is that a class of solutions is now emerging that leverages every aspect of mobile.

In days gone by, mobile-specific applications were developed almost as an afterthought. Typically, they were either created from a quick port of an existing desktop or web-based application, or developed afterwards with long, resource-intensive lead times. Today’s developers can no longer afford to think this way. With so much emphasis now placed on achieving the holy grail of productivity, developers will increasingly build for mobile platforms first, before web and other platforms – leveraging the continuous access users have to these apps, using location and other user context to deliver a better experience and greater utility, and connecting the user to any and all needed back-end services and data to accomplish the processing they want. Anywhere. Any time. From their mobile.

4.     More focus on the business than the consumer

One of the main causes of the BYOD revolution was that it allowed people to access the information they needed in their place of work in the ways they would ordinarily access information in their private lives. The consumerisation of IT trend was based almost entirely on the idea of user experience, and there was little doubt that consumers were king in this scenario. Moving forward, we’ll see a reversal of this, with more and more organisations realising that it’s possible to target the deep pockets of businesses with the sort of user experience once typically associated with consumers. What’s clear is that, as developers increasingly place business expectations over and above individual user preferences, this continuing consumerisation of applications will come hand-in-hand with a business end-user focus.

5.     Modifying applications ‘on the go’

Perhaps the biggest upshot of these new mobility trends is that we’ll see the act of developing applications move away from being the sole preserve of dedicated coding experts. The users that access applications will themselves gain the ability to modify them. This will enable a class of solutions that allows each end-user to easily put together an app that automates some business processing of their own, all accessed from their mobile, with access to any back-end business function or data the user needs. The resulting ability to use mobile devices as an application development tool, and not just as a means to an end for the end-user, could radically alter the way we think of mobile devices. There’s no doubt that this will give way to a new breed of ‘on the go’ citizen developer, as non-technical end-users build and modify their own apps.

These five suggestions may not be the be-all and end-all of developments in mobile over the coming months and years. However, as more and more businesses adopt mobile first strategies and move mobile applications further away from the domain of the consumer and more into the realm of businesses, there seems little doubt that we’ll see a fundamental shift in the way mobility is perceived and used. Will it be enough for these trends to have their own ‘fifteen minutes of fame’? Only time will tell!

Online and Offline Technologies Converge onto a Single Platform

By Rob Garf, VP of Industry Strategy & Insights, Demandware

I had the privilege of teaming with Tom Litchford, VP of Technology at NRF, and Anita Bhappu, Professor at the University of Arizona, on groundbreaking research in the retail industry. The study, Digitising the Store – The Wave of Online and Offline Convergence, collected data from more than 200 U.S. and European retail business technology executives. The data showed that nearly 40 percent of retailers want a single platform to manage consumer interactions and transactions across all channels. And twice as many retailers plan to leverage eCommerce (38 percent) over traditional point of sale (POS) systems (19 percent) to support this strategic direction. In this article, we address some of the opportunities and challenges omni-channel enablement presents for retailers today, as well as some of the drivers encouraging retailers to make the change.

A Single Platform

The new retail reality will be better served by a single platform at the center of the consumer shopping experience. A single platform simplifies the technology environment, serving as the bridge between the virtual and physical shopping worlds. This capability enables retailers to move at the speed of consumers, while also consolidating and managing key data elements, business rules, and functionality that historically lived in multiple systems. As a result, retailers can deliver a seamless shopping experience across channels. And this aligns with what retailers are looking for. According to the study, retail leaders stated a single platform should enhance customer service, standardise business processes, and increase store associate productivity.

New Technology Breaks Traditional Store Formats

There will be significant store investments as retailers aggressively break tradition by launching new store formats, revamping aging locations, testing new concepts, and bringing digital capabilities into the four walls. In fact, 80 percent of retailers surveyed expect to maintain or increase store technology investments over the next three years.

Many of these investments are targeted at the traditional POS system, with 70 percent of retail executives reporting that their organisation is currently deploying or planning to refresh its existing software in the next three years. This POS refresh cycle––which historically occurred approximately every 12 years––has prompted technology leaders to rethink traditional store-centric software, along with other consumer-facing technology, and consider a single commerce instance across channels.

POS Supplanted by eCommerce Technology

Since the advent of the electronic cash register in 1974, traditional store-based POS systems have managed nearly all retail transactions. It took 30 years and the emergence of the Internet for another technology––and channel––to chip away at the status quo. In fact, Forrester Research predicts that eCommerce will reach $371 billion by 2017 in the U.S. alone, when it will account for 10 percent of all retail sales. Perhaps even more significant, Forrester estimates that half (49.5 percent) of total U.S. retail sales today are impacted by the web in some way––comprised of both direct eCommerce sales (8.4 percent) and U.S. retail sales influenced by shopping activity of some kind conducted online before consumers buy in stores or elsewhere (an additional 41.3 percent). Check out Tom Litchford’s take on the evolution of traditional POS.

Over the past 15 years, eCommerce functionality, architecture and extendibility designed for online shopping have surpassed store POS applications. As a result, traditional POS, call center and mobile technologies that directly interact with consumers are increasingly being supplanted by eCommerce to establish a single consumer platform.

Source: Copyright 2014. National Retail Federation. All Rights Reserved.

Retailers Must Proactively Address Key Technology Imperatives

Before retailers can truly reap the benefits of investing in a single platform to connect the online and offline worlds, they need to consider the following three overarching imperatives to take advantage of the unprecedented change the retail industry is experiencing.

1.     Understand Market and Internal Landscape: Retailers must understand the broader marketplace and current state of technology solutions, as well as the internal structure of their organisation and key stakeholders.

2.     Establish Technology Roadmap: Retail executives we surveyed highlighted the challenges involved with complex migrations of legacy systems to a new environment and the need for significant capital investment for a “big bang” approach. The transformation to extend a singular platform across channels—particularly into physical stores—won’t and can’t happen overnight. Retailers need to develop their own technology roadmap that defines success, supports business initiatives and defines a path with clear milestones.

3.     Drive Continual Innovation: Consumer demands are changing at a rapid pace and innovation is currency in the shopping experience. Innovation and speed are not one-time events. Instead, they must become standard operating procedure, made possible by a flexible and scalable platform.

With simplified architecture, synchronised data and real-time intelligence, a single platform presents great opportunities for retailers to deliver a seamless and consistent consumer experience across any channel. But there are steps a retailer must take to ensure the evolution of the store is done at the retailer’s pace, in a manner that makes sense for their organisation.


Insanity is doing the same thing over and over and expecting a different result

Written by Zahid Jiwa, VP UK&I, OutSystems

From the laptop to the laws of gravity, British ingenuity has shaped our world and we have a strong history of championing new ideas, innovations and inventions.  Britain’s capacity to excel in research and development has continued to be at the heart of our economic growth and today we are still one of the world’s leading innovators. As we slowly start to climb out of recession, growth prospects remain at the mercy of continued budget constraints combined with a growing IT skills shortage. Thinking about our legacy as a powerhouse of discovery and invention, it is clear to me that the only way that we can get out of today’s economic malaise is by continuing to invest in sustainable innovation.  However where IT is concerned, I think that’s easier said than done.

But before we talk about innovation, I think I need to stress that innovation doesn’t need to be ‘big bang’ or about radical change or an amazing new invention. Innovation is simply a new way of doing something.  Whether a new feature, a new service or a new process, innovation is often about incremental progress, which enables the business to work more effectively and efficiently.

According to CIO Magazine, business process improvements and increasing IT capacity to drive innovation top the list of most frequently cited IT management priorities. However, according to iTWire, over 25% of CIOs also list innovation and technological change as one of the biggest challenges facing them in 2013.

So why does IT struggle to innovate? Or to put it another way, why does the business perceive that IT can't innovate? I believe there are two contributing factors. The first is that, according to Gartner, at least 80% of the IT budget is still spent on Keeping The Lights On (KTLO). So if total spend on enterprise-wide IT is likely to be $2.68 trillion (as cited in a recent New York Times article), and 80% of this is allocated to what is essentially maintenance, you get the picture of how much KTLO costs. The second factor impacting IT's ability to innovate is the accumulation of new service or change requests made by business units, which puts a massive strain on IT resources. This creates an inevitable backlog, which can result in a failure to deliver. IT departments are then often seen as incompetent, slow, and by default unable to support innovation. The real failure here, however, is actually a lack of budget and resources, not incompetence or an inability to innovate.
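The scale of that KTLO claim is easy to check with back-of-the-envelope arithmetic; the $2.68 trillion and 80% inputs below are simply the figures cited above:

```python
# Rough arithmetic behind the KTLO claim above.
total_it_spend = 2.68e12   # global enterprise IT spend (USD), per the NYT figure cited
ktlo_share = 0.80          # Gartner's "Keeping The Lights On" share

ktlo_spend = total_it_spend * ktlo_share
innovation_budget = total_it_spend - ktlo_spend

print(f"KTLO spend:        ${ktlo_spend / 1e12:.2f} trillion")   # $2.14 trillion
print(f"Left for new work: ${innovation_budget / 1e12:.2f} trillion")  # $0.54 trillion
```

In other words, on these figures barely a fifth of enterprise IT spend is available for anything that could be called innovation.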

The reality is that any business faced with a critical need for new applications and an IT department unable to respond will find a way to solve the problem: building its own application, renting cloud-based solutions that don't need IT's involvement, or hiring an IT consultancy to build the application. However, this doesn't address the problem. In fact, for most companies it makes the problem worse. Technology debt grows and backlogs get bigger. Unable to react, IT inevitably gets left behind and its operating model becomes unsustainable. IT departments are then rendered ineffective and irrelevant, relegated to being a cost centre unable to help the company innovate.

Organisations continue to walk down this path and expect to arrive somewhere different. This has to change; IT has the ability to innovate in so many ways, if only given the chance.


Mobility: Powered by PaaS

By Karen Tegan Padir, CTO Progress

Mobility is amazing. Consider that it was not that long ago when mobility meant simply the ability to make or receive a telephone call or text. Maybe if you were lucky, you also got email or some pale version of a desktop app on your “device” so you could read documents on the go.

Now, mobile is one of the biggest of big businesses and, for developers, it’s a mixed blessing. Systems are being built or rebuilt to support mobility – and then there are all those apps yet to be written (or needing constant updating). That’s not a trivial challenge since there are multiple operating systems and innumerable individual device variations that have to be considered.

Add to those basics the challenges of delivering new kinds of functionality such as context awareness. At a minimum, this means writing an app that "understands" its physical location at a given moment. But it can also include preference, in the form of recent decisions made by the user or consumer that can be discerned from the device or from information available on social networks; situation, meaning not just map-coordinate location but altitude, environmental conditions, velocity, etc.; and even attitude, the state of mind of the user or customer.

It is unlikely that any given app dev project will involve more than a fraction of these potential new requirements. The point, however, is that it's a brave new world for developers: more complex and faster paced than ever before.

One of the best means for coping with this onslaught of challenges is simply adopting a new platform approach to development and deployment – platform-as-a-service (PaaS).

The PaaS Advantage

Why PaaS? PaaS is moving toward an increasingly central position in IT – whether on-premises, in the cloud, or in a hybrid situation. When developers turn to PaaS they do so because they believe they can save time and increase productivity. Given its inherent advantages, PaaS is the way that more and more people will choose to develop applications. Indeed, this approach is rapidly moving to the mainstream.

To date, developers have generally had to pick separate tool sets, depending upon their target deployment platform – one choice for fat clients and something totally different for web apps, mobile, and tablet. Developers had to learn to use different tools for each platform. If you decided your “fat” app needed to work for a mobile device, you may have had to put up with delivering a problematic user experience based on that tooling. If something was optimised for the web or a fat client, it might work for mobile but it wouldn’t work very well. The alternative was more or less starting over from scratch.

Developers will look to PaaS vendors to provide unified delivery of tools. When people choose to create an app they will increasingly look for a development environment that will have all of the required tools needed to easily create a mobile app, a web app, or a fat client, from one tool set. That is why developers will choose a PaaS that provides wide tooling breadth and strong integration. It’s common sense.

Similarly, even within the specific requirements of mobile development, there can be daunting complexities. If you choose to write a native app your testing complexity goes up considerably because you have different operating systems and different handsets. With hybrid approaches you can write the UI code once and deploy to multiple devices because you can have a container that runs on top of all operating systems – so you don’t need to learn device-specific environments for development.
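The write-once idea behind the hybrid approach can be sketched abstractly: one shared UI definition hosted by thin per-platform containers. The class names below are illustrative only, not the API of any real hybrid framework:

```python
# Illustrative sketch of the hybrid-app idea: UI logic written once,
# hosted by thin containers for each operating system. All names here
# are hypothetical, not a real framework's API.
class SharedUI:
    """UI code written and tested once, independent of the host OS."""
    def render(self) -> str:
        return "<button id='buy'>Buy now</button>"

class Container:
    """Base container: wraps the shared UI for a specific platform."""
    platform = "generic"
    def launch(self, ui: SharedUI) -> str:
        return f"[{self.platform}] hosting: {ui.render()}"

class AndroidContainer(Container):
    platform = "Android"

class IOSContainer(Container):
    platform = "iOS"

ui = SharedUI()  # the single UI codebase
for container in (AndroidContainer(), IOSContainer()):
    print(container.launch(ui))
```

The testing burden shrinks accordingly: the `SharedUI` logic is verified once, and only the thin container layer varies per device family.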

By 2015 the choice to write a native app or a hybrid app will be increasingly dictated by tooling available through a PaaS, not just for deployment and runtime but also as the app dev tooling for creating the mobile front end.

Tooling will continue to improve, too, with developers turning to an integrated, "everything-in-the-box" mobile UI development platform so they can quickly and easily extend existing applications to support mobile users, or create new ones. In combination with a visual designer, support for the creation of feature-rich mobile apps that can run on both iOS and Android platforms is important as well. With hundreds of different handsets within each of those domains, each requiring a degree of application customization, this bridging capability represents a huge productivity boost.

Similarly, when developers look to create apps for the Internet of Things (IoT) and the next generation of mobility – wearable devices like Google Glass – they will want similar support. So application developers will have to think about picking tool sets not only to deliver for their already complex spectrum of choices but also for Glass – or a smart sensor, perhaps even a refrigerator.

Moving forward, application developers need to look at mobile development challenges in a broad context that includes traditional platforms as well as mobile and emerging IoT. It’s an overwhelming challenge, but increasingly capable PaaS providers will make it possible – and successful.

Indeed, the stampede of new vendors into the space, including IBM with its $1 billion commitment to BlueMix, is an indication that the whole industry "gets" PaaS. Gartner recently published its first Magic Quadrant for aPaaS, in which a handful of leading PaaS players are categorized as challengers, leaders, niche players, or visionaries. Fundamentally, developers and IT departments in general are dealing with stiffer competition and limited resources. PaaS helps them address those challenges. In fact, a PaaS has solid advantages in terms of supporting better and faster development, agility, analytics, and scalability, while offering more favorable cost structures.

This year we’ll see PaaS begin to drive a wave of change, within IT and across organisations. It’s also a perfect fit with the do-business-anywhere-and-anytime trend. Here again, consumers have shown the way, with smaller, more focused and easy-to-deploy apps that are better tailored to the needs of roles and individuals than the giant, “monolithic” applications that have long dominated corporate life.

In addition to the groundswell of vendor interest, many more end-user organisations are adopting PaaS. In short, I definitely see 2014 as the “Year of PaaS.” If you haven’t yet done so, take some time to learn about PaaS and think about building it in to your plans sooner rather than later. It’s the competitive edge you will need!


Beach to Breach – Reducing BYOD Security Risks During the Summer Holidays

By Sean Newman, Field Product Manager at Cisco

It’s the time of year when people start booking their summer holidays, and for employers it is vital that they ensure their BYOD policies are rigorous enough to protect their business against any potential data breach while their staff are away enjoying a fortnight in the sun.

The balance between work and social life has become more blurred with employees able to access websites, social media and emails from their smartphones or tablets in or out of the office anytime and anywhere in the world. As a result, concerns around BYOD have increased. While companies recognise the benefits of mobile technology in terms of productivity and competitiveness, they are not always focused on the risk this poses in terms of potential cyber-attack.

There is no doubt that adoption of mobile devices in the workplace presents a challenge that is as much a question of policy and control as it is about the technology itself. According to analyst firm TechMarketView, over 10 million UK employees are predicted to be using personal devices in the workplace by 2016.

Manufacturers are pushing tablets as the must-have device for everyone in the family, whether it’s a high-end iPad from Apple or the cost-effective Hudl from Tesco. What does that mean for the enterprise? It means an influx of new devices coming onto the network, because you can bet your life they won’t be staying just in the home.

For the IT security team this has the potential to be a real headache as they count the ways in which the BYOD trend complicates their work lives. And, as the transition from desk-bound computers to laptops, tablets and smartphones continues gathering pace, it’s no surprise that hackers are choosing mobile devices as their next target. It makes economic sense and they are simply ‘following the mobile money’.

The issue with employee-owned mobile devices is that they can access corporate resources outside of the control of the corporate IT function. This means it can be difficult to identify even basic environmental data for these devices, such as the number and type of devices being used, and the operating systems and applications they are running.

The proliferation of mobile devices and their growing use in the workplace has fuelled a rapid growth in mobile malware, significantly increasing the risk to individuals and their employers. According to US authorities, 79% of malicious attacks on mobile devices in 2012 occurred on devices running Google’s Android operating system. Given the lack of even basic visibility, many IT security teams simply don’t have the capability to identify potential threats from these devices.

However, despite the pitfalls, the benefits of BYOD are often too strong to ignore.  So, in order to regain control in this mobile world, IT security professionals must be able to see everything in their environment, so they can establish risk level and then secure it appropriately. For most enterprises, the right solution is to implement BYOD policies that clearly define the proper use of employee-owned devices in the enterprise and then have enough checks and controls in place to enforce those policies.

At the end of the day, security of mobile devices is ultimately a question of three phases:

  • Before – establish control over how mobile devices are used and what data they can access and store.
  • During – maintain visibility and intelligence; these are vital if security professionals are to identify threats and risky devices and monitor their activity on the corporate network.
  • After – when the inevitable happens and the network is compromised by a threat, be able to retrospectively review how that threat entered the network, which systems it interacted with, and what files and applications were run, so it can be cleaned up as quickly as possible.
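The "before" phase, checking a device against policy before it is granted access, can be sketched very simply. The policy fields and device attributes below are hypothetical examples, not any particular MDM product's schema:

```python
# Hypothetical BYOD policy check: deny network access to any device
# that falls outside the documented policy. Field names are illustrative.
POLICY = {
    "allowed_os": {"iOS", "Android"},
    "min_os_version": 7,       # reject old, unpatched OS releases
    "require_passcode": True,  # lock screen must be enabled
}

def device_allowed(device: dict) -> bool:
    """Return True only if the device satisfies every policy rule."""
    return (
        device.get("os") in POLICY["allowed_os"]
        and device.get("os_version", 0) >= POLICY["min_os_version"]
        and (device.get("has_passcode", False) or not POLICY["require_passcode"])
    )

print(device_allowed({"os": "Android", "os_version": 9, "has_passcode": True}))  # True
print(device_allowed({"os": "Android", "os_version": 4, "has_passcode": True}))  # False
```

The real value of writing the policy down in this machine-checkable form is that enforcement becomes automatic rather than a matter of trusting each employee to read the handbook.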

Whilst employees on holiday need to remember the risks of spending too long exposed to the sun, organisations need to ensure the risks posed by those employees’ mobile devices don’t expose corporate assets to misuse or theft; otherwise, they won’t be the only ones getting burned.


Taking Control of Your Cloud

Lilac Schoenbeck, Vice President of Product Management and Marketing at iland

Cloud computing services are increasingly being adopted into the mainstream and are fast becoming an integral part of every organisation’s IT strategy. Many industry analysts say that cloud will become a major platform for growth, especially for mid-market businesses. This is because, previously, if an organisation wanted to get a new idea off the ground, it often had to make a significant upfront investment in IT before even knowing whether the business idea would work. The cloud, however, levels the playing field.

When done right, cloud takes away barriers to entry and makes technology available to all organisations regardless of size. From day one, a business can ramp up very quickly and easily without having to make a serious upfront capital investment. The move to the cloud is seamless. Costs are predictable. There are no big step changes or spikes in costs for maintenance or renewal requirements. Remote working and disaster recovery can also be built in.

However, because of this rapid growth and evolution, it could be argued that the definition of cloud has become somewhat unclear. Today, the term is used for everything from physical hosting “elsewhere,” to Gmail, to almost anything imaginable in between. It seems that the meaning of cloud is different to different organisations depending on how cloud services are being used.

Some companies view cloud as anything stored or accessed from anywhere that isn’t “actually here on this very piece of hardware.” The average consumer views the cloud as the internet: to them, Gmail, YouTube and Pinterest are all in the cloud. But, pressed further, I think they would cite cloud back-up, for example, as “real” cloud. To business users, business apps are the cloud; they see marketing apps like Marketo, Eloqua and HubSpot as cloud, and some may also cite web hosting. To an infrastructure user, cloud is a place to get external resources (CPU, RAM, storage) which can scale up or down, and some clouds may include varying amounts of software. Infrastructure people usually draw a line between SaaS and cloud, unlike with PaaS, which is currently being described as one of the fastest growing areas of all cloud computing services. For example, analyst firm Gartner estimates a steep rise in PaaS adoption and forecasts an increase in spending to more than $2.9 billion by 2016.

In the midst of all this uncertainty, one thing all users believe and fear is that once something is in the cloud it is completely out of their control. This misconception is the reason that many large enterprises and government organisations only use the cloud for testing and development. Data is brought back in house when the IT project is ready to go into live production. Equally, many organisations have data sovereignty issues (i.e. they cannot permit their data to reside on services outside the EU) which limits the extent to which they can utilise cloud services. Naturally, once something is uploaded to a sharing site, a great deal of control is lost.

But this doesn’t have to be the case for IT systems, as some cloud service providers are evolving to address this concern. If a workload is hosted in the cloud with a cloud service provider, users should be able to define the actual location of that workload so it can be as close to home as they’d like or farther away for disaster recovery purposes. It’s important for companies to look for a provider that offers this level of control.
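That kind of placement control amounts to filtering a provider’s regions by jurisdiction before a workload is provisioned. The region list and helper below are a hypothetical sketch, not any provider’s real catalogue or API:

```python
# Hypothetical sketch: choose a hosting region that satisfies a data
# sovereignty constraint (e.g. "data must stay within the EU").
# The region catalogue is illustrative only.
REGIONS = [
    {"name": "London",     "jurisdiction": "EU"},
    {"name": "Manchester", "jurisdiction": "EU"},
    {"name": "Virginia",   "jurisdiction": "US"},
]

def eligible_regions(required_jurisdiction: str) -> list[str]:
    """Return region names whose jurisdiction meets the sovereignty requirement."""
    return [r["name"] for r in REGIONS
            if r["jurisdiction"] == required_jurisdiction]

print(eligible_regions("EU"))  # ['London', 'Manchester']
```

An organisation bound by EU data residency rules would provision only into the regions this filter returns, and reject any provider that cannot guarantee the workload stays there.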

This control is also important because, today, governments are still defining their laws regarding jurisdiction and access to data in their territory, and many organisations have preferences regarding which country their data should inhabit. If workloads can be sent flying across national borders, users have lost a great deal of control over their own fate, and that of their data, which can be a costly trade-off. Check the fine print of your cloud service provider agreement.

No matter what their organisation’s definition of the cloud, users need to select infrastructure providers that are able to make it usable to the everyday business while addressing regional data sovereignty issues. At iland, we are growing to address the regional needs of our customers. Most recently, we added a data centre in Manchester and extended our operation in London. What’s more, through this growth, we’ve developed a turnkey process to rapidly expand to other countries as customers demand it.
