Thursday, 21 May 2020

Why Digital Transformations Fail



Organizations are all feeling the sudden and increased urgency of digital transformation. But there’s still a lack of clarity on what digital transformation actually means. Definitions vary widely between companies and within companies. My view is that when a term can be stretched to mean just about anything, it starts to mean nothing. Urgency without clarity is a risky combination.

The biggest challenge in today’s world is the language surrounding digital transformation. The term has been co-opted by every IT marketing person selling anything from personality to followers.

Around the world, organizations are spending about a trillion dollars a year on digital transformation. Seventy percent of those transformations fail. I believe that this is happening because language prevents business and public sector owners from setting the right end goal. It also prevents them from following a very disciplined approach to getting there.

A report last year from CompleteSpectrum showed a confusing mix of definitions of digital transformation among senior leaders. They agreed on only one thing: 94% reported that digital transformation was high on their list of priorities.

I think it makes sense that digital transformation will vary for a bank versus a university versus a toothpaste brand, but it’s critical for teams within an organization to have clarity.

What I want to do is be precise and talk about the technology that sits at the heart of digital transformation.

Friday, 11 January 2019

How AI (Artificial Intelligence) is set to transform digital marketing this year



Digital marketing is an ever-changing landscape, thanks to the strides in digital technology. The prevalence of smartphones and tablets, along with growing internet penetration, has added more power to digital marketing. Going digital means using the entire gamut of technologies, from artificial intelligence to wearables. It’s no longer just about having a website or email marketing.

As a brand, making an impact on every customer (internal or external) and opening up multiple channels of communication is no longer a luxury but a necessity.
Businesses know that the present and future of marketing are undoubtedly digital. Millennials and Gen Z-ers live in a digital age, using smart devices to work, party, commute, and play. In fact, the boundaries between marketing, smart devices, and technology have blurred, and the medium has become the message.

Bots for company

What trends should marketers be eyeing right now to get their message across? Artificial Intelligence (AI) has already become an intrinsic part of our lives without us paying too much attention to it. Google’s machine learning algorithms are at work when you search and get a satisfactory answer. Similarly, Facebook uses algorithms that learn from what we read or watch. There are intelligent tools that generate and curate content or provide an enhanced user experience on websites.

Programmatic advertising

Another area of significance for digital marketing is programmatic advertising. Programmatic advertising is essentially serving advertisements to targeted and specific users by using data. When AI is integrated into the process, programmatic ad placement can only get better. Natural language processing (NLP) has the potential to understand the message and real meaning of a specific piece of content and see how an ad can be an ideal match for that content.
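As a rough illustration of that matching step (not any particular ad platform’s implementation), here is a minimal Python sketch that scores hypothetical ad copy against a page’s content using TF-IDF and cosine similarity; the article text, ad texts, and scoring choices are all assumptions made for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical page content and candidate ad copy
article = "New research shows trail running improves endurance and recovery times."
ads = [
    "Lightweight trail running shoes built for long-distance endurance.",
    "Zero-interest credit cards for your holiday shopping.",
    "Protein blends that speed up post-workout recovery.",
]

# Turn the article and the ads into TF-IDF vectors
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([article] + ads)

# Score each ad against the article and pick the closest match
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
best = scores.argmax()
print(f"Best-matching ad: {ads[best]!r} (score={scores[best]:.2f})")
```

A production system would use far richer language models and bidding signals, but the underlying idea of matching ad meaning to page meaning is the same.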

Going personal

Observe how every trend is geared towards addressing the customer as if they were the only one the brand is talking to. Personalised messaging is the biggest trend the digital marketing world is seeing, and will keep seeing in the near future. This could take the form of personalised notifications on apps, personalised emails, content, banners, and so on. Just log into your groceries app or a video streaming service, and you will know all about personalisation. Your Netflix recommendations or follow suggestions on social media are also based on your own history and usage patterns.
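For a feel of how such recommendations can work, here is a deliberately tiny Python sketch of co-occurrence-based suggestions; the users, items, and scoring rule are all made up for illustration and are far simpler than what Netflix or any social network actually runs.

```python
from collections import Counter

# Hypothetical viewing histories
history = {
    "ana":   ["thriller_a", "thriller_b", "docu_a"],
    "bruno": ["thriller_a", "thriller_b", "comedy_a"],
    "carla": ["thriller_b", "comedy_a", "comedy_b"],
}

def recommend(user, histories, top_n=3):
    """Suggest items watched by users whose history overlaps with this user's."""
    seen = set(histories[user])
    scores = Counter()
    for other, items in histories.items():
        if other == user:
            continue
        overlap = seen & set(items)
        for item in set(items) - seen:
            scores[item] += len(overlap)  # weight by how similar the two histories are
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("ana", history))  # -> ['comedy_a', 'comedy_b']
```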

Tuesday, 25 September 2018

How To Choose The Right ERP Deployment Destination



Should you run your ERP system on-premise or in the cloud? Which approach is best for your business? If you’re tasked with formulating the IT strategy for your company, the advice on offer can seem conflicting and less than helpful.
Well, sorry to muddy the waters a bit more, but the truth is, it’s not a straightforward choice between cloud or on-premise deployment. There are numerous models, combinations, and hybrid solutions to consider. However, rest assured that your organization can find the right fit.
Over the coming weeks, I will be looking at the pros and cons of the available deployment approaches. Let’s start by looking at the three categories of deployment destination.

Public cloud

A public cloud service is potentially the most economical, because the service providers host many organizations on the same stack:
  • Physical infrastructure (including utilities, security, and disaster recovery)
  • Operating systems (OS)
  • Database systems (DBMS)
  • Application systems (the ERP application itself)
Although your data is kept separate, you share everything else with other customers.
Because your software vendor manages the infrastructure and application for you, you can get up and running quickly and scale up or down when needs change. With regular updates managed by the software provider, you can take advantage of the latest features and innovations without getting bogged down with maintenance. And with subscription-based licensing and no need to purchase hardware, this option requires minimal up-front investment and offers predictable costs that can usually be accounted for as an operating expense rather than a fixed asset.
Standardized ERP packages enable you to implement best-practice, streamlined processes across every business function, including finance, HR, customer relationship management, and supply chain logistics. As a result of all this, public cloud is often the best choice for startups, smaller companies, and subsidiaries.
The same standardization that drives down costs, however, also leads to less flexibility. Public cloud ERP will typically not allow modifications or support all industry-specific workflows or the varying processes of individual business units. Some organizations might not even be able to run ERP according to a standardized approach.

Infrastructure as a service (IaaS)

Another option is to run ERP on an infrastructure provider such as Microsoft Azure, Amazon Web Services, or Google Cloud Platform. They provide the physical infrastructure and leave the OS, DBMS, and ERP to you. In other words, you share the physical infrastructure running your applications, but your organization is the sole user of the database and ERP software. Essentially, you go public in one part and private in the other.
In this case, your organization is responsible for software setup, maintenance, and updates. Why would you want to take on this work when there are vendors that will manage it for you? Because you have more control. For example, updates typically happen every quarter with public cloud ERP. This might seem like a good thing at first, but perhaps your organization cannot absorb such rapid change with ERP. Also, you might want to run industry solutions not available in the public cloud, or customize processes beyond what public cloud can support. Instead, you can manage the updates and customize processes according to specific business needs.
While it won’t be as economical as the public cloud, running ERP on IaaS allows you to commoditize infrastructure, yet maintain almost as much control as an on-premise deployment.

On-premise

If you want complete control, an on-premise deployment is for you: You’re in charge of managing your entire ERP environment from the physical layer on up. You can adapt the software as required to respond to the business, and customize it to fit processes intrinsic to your industry and business units.
With this more traditional approach, of course, the responsibility for deploying and maintaining the software and IT infrastructure falls to you, with or without the help of a third-party systems integrator. And it is more difficult to scale the system up or down.
There’s another variation on this theme to keep in mind: using a hosted data center offered by a company like IBM or HP. With this, you outsource the physical layer to get capabilities beyond what your own IT team can provide, especially for disaster recovery. Many of these companies also offer application management packages to perform maintenance and upgrades more efficiently than you could on your own.

Keeping your options open

While it might indeed seem complicated, once you weigh the options and consult with your stakeholders, you can map the strategy that’s right for your organization. A way to simplify all this for ERP is to consider your choices over five major dimensions:
  • Economics: public cloud is best, IaaS better, on-premise/hosted most costly
  • Business processes: core, standardized processes on public cloud; all processes on IaaS and on-premise/hosted
  • Customization: within boundaries only on public cloud; open for ERP but within boundaries for infrastructure on IaaS; completely open on-premise/hosted
  • Maintenance: done by the vendor on public cloud; shared by vendor and customer on IaaS; done by the customer on-premise/hosted
  • Innovation pace: quarterly updates by the vendor on public cloud; annual updates by the customer on IaaS and on-premise/hosted
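For readers who like to see the trade-offs written down as data, here is a minimal Python sketch that encodes the comparison above and shortlists options against a few requirements; the dictionary keys, the keyword matching, and the helper itself are my own illustrative assumptions, not a formal selection tool.

```python
# Hypothetical encoding of the comparison above
DEPLOYMENT_OPTIONS = {
    "public_cloud": {
        "economics": "best",
        "business_processes": "core, standardized",
        "customization": "within boundaries only",
        "maintenance": "vendor",
        "innovation_pace": "quarterly updates by vendor",
    },
    "iaas": {
        "economics": "better",
        "business_processes": "all",
        "customization": "open for ERP, bounded infrastructure",
        "maintenance": "vendor and customer",
        "innovation_pace": "annual updates by customer",
    },
    "on_premise_or_hosted": {
        "economics": "most costly",
        "business_processes": "all",
        "customization": "completely open",
        "maintenance": "customer",
        "innovation_pace": "annual updates by customer",
    },
}

def shortlist(requirements):
    """Keep options whose description of each dimension contains the required keyword."""
    return [
        option
        for option, traits in DEPLOYMENT_OPTIONS.items()
        if all(keyword in traits.get(dimension, "")
               for dimension, keyword in requirements.items())
    ]

# Example: we need open customization and are willing to own maintenance.
print(shortlist({"customization": "open", "maintenance": "customer"}))
# -> ['iaas', 'on_premise_or_hosted']
```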
What’s right for one part of the business may not work for another. Many companies create a multi-tier environment, combining two or more deployment methods in a hybrid approach. For example, an organization might run its subsidiaries on public cloud but keep headquarters on-premise.
What’s crucial is that the solutions work together. But beware: Many don’t.
Look for ERP that offers consistency in the code line, data model, and user experience, whether you choose cloud or on-premise. Your employees, whether based in an overseas subsidiary or the head office, should have the same user experience and be able to share information, run reports in the same format, and follow the same steps to complete tasks. This way, you can eliminate organizational silos and establish an integrated digital foundation for the future.

Tuesday, 13 March 2018

Here Are The 5 Microsoft Skills Data Analysts Need To Know


Microsoft programs are so commonplace in most business environments that being incredibly proficient in their use can sometimes be overlooked. Despite spending more time than you’d like to think about in Excel and PowerPoint, there are a ton of tips and tricks you’re likely not using yet. If you know how to dig deep into the advanced functionality of these programs, you'll vastly improve the way you do your work.
If you work with data regularly, you'll find the Microsoft Data Analysis Bundle incredibly helpful to your everyday workflow. And even if you don't crunch numbers for a living, being a data-driven employee and candidate only increases your value and employability. The bundle shows you how to use Microsoft Power BI, Advanced Excel, Advanced VBA, and Advanced Microsoft Access with total proficiency.
More specifically, these are a few of the skills to learn:
How to make your data interactive: Spreadsheets can do more than help you organize information. One of the best use cases of Advanced Excel is automating the development of advanced graphs, creating visuals that are powerful enough to be actionable from a business standpoint (see the sketch after this list for the same automation idea expressed in code).
How to develop compelling data visuals: Data can tell a story and Microsoft Power BI gives you ways to present your data in compelling formats, performing complex data modeling relationships and generating dashboards that you can share with colleagues and clients.
How to automate complex tasks: We all have to do mundane, tedious tasks sometimes. With Advanced VBA, you can write and implement Excel events, which automate these tasks. The course is also CPDUK-accredited, so it looks good on your resume (especially when you say you're a lifetime learner).
How to hone advanced techniques: Whether you have a basic knowledge of Microsoft programs or consider yourself a power user, there are ways to streamline your workflow and be more efficient with your time. Using Microsoft Access, you'll learn how to create and maintain macros and even discover advanced options for the use of forms.
How to solve complex problems: Whether you're a data analyst or just want to find ways to use data to do your job more efficiently, this bundle helps you find ways to use Excel to figure out answers to complex business problems, whatever they might be.
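The bundle itself teaches these techniques inside Excel, VBA, Power BI, and Access. Purely to illustrate the chart-automation idea mentioned above, here is a hedged Python sketch using the openpyxl library (my own choice of tool, not part of the bundle), with made-up sales figures.

```python
from openpyxl import Workbook
from openpyxl.chart import BarChart, Reference

# Build a tiny workbook with hypothetical sales data
wb = Workbook()
ws = wb.active
ws.append(["Region", "Sales"])
for row in [("North", 120), ("South", 95), ("East", 143), ("West", 88)]:
    ws.append(row)

# Generate the chart programmatically instead of clicking through the ribbon
chart = BarChart()
chart.title = "Sales by Region"
data = Reference(ws, min_col=2, min_row=1, max_row=5)
categories = Reference(ws, min_col=1, min_row=2, max_row=5)
chart.add_data(data, titles_from_data=True)
chart.set_categories(categories)
ws.add_chart(chart, "D2")

wb.save("sales_report.xlsx")  # re-run whenever the numbers change
```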

Wednesday, 7 February 2018

Digital Transformation will kill ERP

It is a strong statement, but right now nothing is more certain than change. When giant corporations are created from a basement in San Francisco or a student hall at Harvard, it is not hard to question the future of a piece of software. If digital transformation can “disintegrate” enterprises, why could it not destroy a concept?

Forget about ERP for a moment: the way we have run enterprises for years, despite technological change, is to keep everything in a book. The book of transactions called the general ledger: payables, receivables, cash flow, fixed assets, and purchasing.

Why? In a simple sentence: because we need to know whether we can take money home. And it does not matter whether it is a small owner-operated business or a large publicly traded company; it is the same. But what if there is no centralised book? We still need to know whether we can take the money home. And what if there is no money?

It is a technology for a new generation of transactional applications that establishes trust, accountability, and transparency while streamlining business processes.
It is the design pattern made famous by bitcoin, which is the reason I use it in this text, but its applications go far beyond cryptocurrency. With it, we can re-imagine the world’s most fundamental business interactions and open the door to inventing new styles of digital interactions.


It has the potential to vastly reduce the cost and complexity of cross-enterprise business processes. The distributed ledger makes it easier to create cost-efficient business networks where virtually anything of value can be tracked and traded, without requiring a central point of control.
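To make the ledger idea concrete, here is a minimal, purely illustrative Python sketch of a hash-chained transaction log; the transaction fields and the chain structure are assumptions for the example, and real distributed ledgers add consensus, networking, and much more on top.

```python
import hashlib
import json
import time

def make_block(transactions, previous_hash):
    """Bundle transactions with a link to the previous block, then hash the result."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Every participant can recompute the hashes and detect any tampering."""
    for prev, curr in zip(chain, chain[1:]):
        body = {key: value for key, value in curr.items() if key != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if curr["previous_hash"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

# A toy shared ledger with two blocks of (hypothetical) transactions
chain = [make_block([{"from": "buyer", "to": "supplier", "amount": 100}], previous_hash="0" * 64)]
chain.append(make_block([{"from": "supplier", "to": "carrier", "amount": 20}], chain[-1]["hash"]))

print(verify(chain))  # True, until someone edits a past transaction
```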

Wednesday, 24 January 2018

Edge Computing vs. Cloud Computing: What’s the Difference?


You are likely hearing a new term now: edge computing. Technologists (and the press, let’s be honest) tend to throw a word around before it is well defined, and in that vacuum comes a variety of guessed definitions of varying accuracy.

The term cloud computing is now as firmly lodged in our technical lexicon as email and Internet, and the concept has taken firm hold in business as well. My observation is that a “no cloud” policy will soon be as rare as a “no Internet” policy. Which is to say, no one who wants to stay in business will be without the cloud.

Edge computing is a term you are going to hear more of in the coming years, because it precedes another term you will be hearing a lot: the Internet of Things (IoT). You see, by the commonly adopted definition, edge computing is the technology necessary to make the IoT work.

And, at the risk of being guilty of the same, let me describe edge computing as a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet.” I hope I did not lose you.

Meet edge computing: processing data at the edge of the network, where it is taken in, has a number of benefits, starting with reduced latency, which makes connected applications more responsive and robust. Some applications need an immediate response, such as a sensor for failing equipment or for detecting a break-in.

It also takes the computation load off the data center: if data can be processed and acted upon at the point of origin, it does not have to make the round trip to and from the data center. That reduces the burden on both the data center and the network.
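As a simple illustration of that pattern (not any specific edge product), here is a small Python sketch of an edge node that reacts locally to anomalous sensor readings and forwards only compact summaries upstream; the sensor values, threshold, and batch size are all invented for the example.

```python
import random
import statistics

def read_sensor(n=1000):
    """Simulated stream of raw readings arriving at an edge node."""
    for _ in range(n):
        yield random.gauss(70.0, 2.0)  # e.g. a temperature sensor

def edge_process(readings, alert_threshold=80.0, batch_size=100):
    """React locally to anomalies; send only summaries to the central data center."""
    batch, summaries = [], []
    for value in readings:
        if value > alert_threshold:
            # Immediate, low-latency local reaction (no round trip to the cloud)
            print(f"LOCAL ALERT: reading {value:.1f} exceeds {alert_threshold}")
        batch.append(value)
        if len(batch) == batch_size:
            summaries.append({"count": len(batch),
                              "mean": statistics.mean(batch),
                              "max": max(batch)})
            batch = []
    return summaries  # this, not the raw stream, is what goes upstream

summaries = edge_process(read_sensor())
print(f"Forwarding {len(summaries)} summaries instead of 1000 raw readings")
```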

You may hear edge computing referred to by names other than micro data centers. They include fog computing and cloudlets. Fog computing, or “fogging,” is a term used to describe a decentralized computing infrastructure that extends the cloud to the edge of the network.

Cloudlets are mobility-enhanced micro data centers located at the edge of a network that serve the mobile or smart-device portion of the network. They are designed to handle resource-intensive mobile apps, take the load off both the network and the central data center, and keep computing close to the point of origin.

Monday, 4 December 2017

HADES (High-fidelity Adaptive Deception & Emulation System): an alternate reality that thwarts hackers by tricking them into believing an attack worked


Cyberspace has become the new frontier for next-generation battle. As hackers launch more sophisticated attacks, security researchers are racing against time to develop effective cyber defences. Now, experts have developed a new system that could deter hackers like never before. HADES (High-fidelity Adaptive Deception & Emulation System) is the new next-gen cyber-defensive system – an "alternate reality" that has been designed to trick hackers into exposing their tools and techniques by making them believe that their attacks are progressing successfully.

HADES is the brainchild of security researchers at Sandia National Laboratories. It is essentially a system that clones the environment a hacker aims to breach. When an attack is discovered, instead of immediately cutting off the hacker's access to the system, the attacker is lured into HADES. The alternate reality provided by HADES lets the hacker carry on with the attack without being alerted that he/she has already been detected.

HADES also provides security experts with a unique opportunity to analyse hackers' techniques and tools in real time.
"Deception is the future of cyber defense," security researcher Vince Urias, who along with his team, created HADES, said in a statement. "Simply kicking a hacker out is next-to-useless. The hacker has asymmetry on his side; we have to guard a hundred possible entry points and a hacker only needs to penetrate one to get in."
"So, a hacker may report to his handler that he or she has cracked our system and will be sending back reports on what we're doing. Let's say they spent 12 months gathering info. When they realize we've altered their reality, they have to wonder: at what point did their target start using deception, and at what point should they stop trusting the data? They may have received a year or so of false information before realizing something is wrong," Urias explains.
By the time the attacker eventually figures out that something is wrong, he has already exposed his methods and tools. "Then he's like a goldfish fluttering in a bowl," Urias said. "He exposes his techniques and we see everything he does."
However, HADES has one disadvantage – the more complex the deceptive environment, the more CPU power and memory resources required to deploy the system.
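A deceptive environment can be arbitrarily simple or complex. At the trivially simple end of that scale, and only to make the general idea concrete (this is nothing like HADES' high-fidelity environment cloning), a decoy service that plays along with an intruder while logging everything might look like the hypothetical Python sketch below; the port, prompts, and log format are all made up.

```python
import datetime
import socket

def run_decoy(host="0.0.0.0", port=2222, logfile="decoy.log"):
    """A toy decoy: pretend to be a real login service and record what intruders try."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen()
        while True:
            conn, addr = server.accept()
            with conn, open(logfile, "a") as log:
                log.write(f"{datetime.datetime.now()} connection from {addr}\n")
                conn.sendall(b"login: ")        # looks like a real service
                attempt = conn.recv(1024)       # capture whatever the intruder sends
                log.write(f"  attempted input: {attempt!r}\n")
                conn.sendall(b"Welcome.\n")     # let the "attack" appear to succeed

if __name__ == "__main__":
    run_decoy()
```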
HADES has already allowed security experts to locate malware introduced into a system by an attacker, and it can be deployed against an active attack. The US Department of Homeland Security (DHS) is working with Sandia to deploy it.
The unique system may help organizations barricade against threats while simultaneously gathering information on adversaries.

