As a Google Partner, Panalysis wants to ensure our clients prepare for Apple’s impending iOS 14 App Tracking Transparency (ATT) policy updates.
It is important for developers and advertisers to understand the implications of the ATT policy, how to ready themselves for the changes and how Google is complying with the amendments.
Developers will need to request permission when using certain information from other companies’ apps and websites for advertising purposes. This applies even if previous consent has been granted by users.
Developers are likely to notice an impact on Google ad revenue as a result of reduced visibility into performance metrics and measurement, including conversion modelling.
Google will be updating Google Mobile Ads SDKs for iOS (for AdMob or Ad Manager) to aid iOS monetization.
There will be a reduction in visibility into key metrics that show how ads drive conversions. This will affect how advertisers value and bid on ad impressions.
Advertisers will likely experience fluctuations in delivery and performance for iOS-run ad campaigns.
Google will assist by using modelled conversions on more iOS 14 traffic and has launched new solutions, such as the Google Analytics for Firebase integration with Apple’s SKAdNetwork, for iOS campaign attribution.
Some Google iOS apps currently use information such as Apple’s IDFA identifier. Once updated, these apps will no longer use this information and will therefore not show the ATT prompt.
Google has committed to providing transparency, choice and control while further investing in privacy technology in order to maintain an open app ecosystem.
The Panalysis team will release more information about the ATT Policy as it is received. In the meantime, feel free to contact us to discuss the Policy, changes and implications for your business.
This month Google introduced its new version of Google Analytics, Google Analytics 4 Properties.
Designed to help businesses get the insights they need now and to prepare for what’s to come in future, the update was driven by changing consumer behaviours and privacy concerns.
With marketers relying more heavily on data, analytics and insights, this release is aimed at helping users improve ROI from their marketing activities.
Building on the foundation of App+Web and with machine learning at its core to automatically produce insights and customer understanding, the new Google Analytics is now the default experience for new properties.
As such, it’s essential to understand the ins and outs of the changes to Google Analytics.
We’ve summarised the key highlights from Google’s release for you, below.
As a first step in responding to the changes, Google is encouraging Google Analytics users to create a new Google Analytics 4 property alongside their existing properties.
Contact Panalysis for more in-depth information about Google Analytics 4 and how the release will impact your business.
Google recently removed the Network Domain and Service Provider dimensions from Google Analytics reporting and has confirmed this to be a permanent change, stating that it is no longer possible to report on these dimensions in Google Analytics. As a result, this data will not be available in the future.
THE POTENTIAL IMPACT
If your company is using these dimensions in your GA account, chances are you will no longer be able to easily and readily report on, or segment your traffic by, network domain or service provider.
SUGGESTIONS & SOLUTIONS
As a first step, we recommend:
We know these changes can be frustrating and difficult to keep up with.
The Panalysis team is here to help you understand these changes and find solutions that minimise their impact on your company.
Please contact us if you need our assistance.
Google recently released an update to the Chrome browser, called Chrome 80. This update could potentially impact business processes of advertising technologies.
Your company could be adversely affected if your advertising relies on third-party vendors using third-party cookies for conversion tracking, frequency capping, and audience creation and targeting.
Check out MD Rod Jacka’s opinion as he weighs in on the subject in CRN Magazine’s August 2018 feature.
When the concept of big data first burst onto the scene earlier this decade the focus was largely on the very big problems that very big organisations could tackle using very, very, big data sets.
Over time big data has shrunk a little. The tools that enable analytics have become more commoditised, affordable and, therefore, applicable to more modest projects.
But while the tools themselves might now be better suited to Australia’s 250,000 or so mid-sized enterprises, channel companies aren’t yet developing big data practices.
Why not? Because the opportunity in big data may not be the one that everyone expects, according to the Sydney-based managing vice president at Gartner, Ian Bertram.
“The opportunity is to sell how to increase data literacy and the ability to use the data that [clients] have got,” Bertram says. “Teaching them how to understand what their data is telling them and what needs to be done in their organisation is the opportunity.”
That means starting with the basics around data literacy, before moving them on to more advanced analytics, and eventually on to AI.
“That midmarket is very competitive, and they want to be able to pick up trends and that little bit of insight to get that competitive edge early enough, because competitive windows are closing much faster than in the old days,” Bertram says.
“And unless they are investing in some of these tools that allow them to pick up what is happening with their business early enough and give them time to react to change, then they are going to go out of business quicker.
“So there is the demand for it, they are very scared about where to start.”
That may be the case, but it seems to be an opportunity that much of the channel is yet to pursue. Even those organisations that are servicing midmarket companies generally do so as an adjunct to their main work with larger organisations.
Analytics might seem like a new opportunity for many channel companies. Not to Rod Jacka.
His 17-year-old firm, Panalysis, has provided these services for years. Panalysis was the first Google Analytics partner appointed in Australia, and while its focus remains in the digital space working with enterprise clients, a year ago Jacka signed a partnership with Domo to help clients better work with data from different systems.
Jacka says Domo has proven useful for SME clients, helping them demystify data by presenting easy-to-consume dashboards and alerts that deliver owners what they need to be successful, such as whether they are on track or not. This has been of particular benefit to one of Jacka’s SME clients, the gift hamper company Gourmet Baskets.
“He uses a combination of tools, one of which is [marketing and sales management platform] Infusionsoft, and Xero for his invoicing,” Jacka says. “For him to understand his market and break his market into different stratas, he can’t do that in Infusionsoft easily. And so Domo has allowed him to connect this up and get visibility across each of the platforms he’s used in a single, easy-to-digest manner.”
Jacka says one of the secrets to working with smaller clients on data projects is to never talk about data.
“The business owners know what they want to do, but they often don’t know how to do it,” Jacka says. “Data isn’t really what they are talking about, but they need data in order to achieve their goal.
“I actually prefer not to use the word ‘data’. I prefer to use the word ‘evidence’.”
Other organisations that now offer services in the data space often started out somewhere else. Melbourne-based automation and DevOps consultancy Vibrato built its analytics practice 18 months ago through the evolution of its DevOps consulting business to encompass the concept of DataOps.
Vibrato’s chief executive and founder, Peter Gatt, says over time his firm has learned to package up analytics services into fixed-price solutions that are suitable for SMEs. He says smaller thinking is required to help smaller clients.
“With a small and medium business we say, ‘Let’s look at one analytical thing you are trying to ask or answer, and that traditionally reporting could help answer’,” he says. “It is not about bringing in petabytes of data and running Hadoop protocols. We are just going to coordinate the data they have into something that’s logical so they can start to answer questions about their clients and commercialise that data.”
Send in the clouds
In some instances Gatt says engagement with smaller businesses is actually easier, as it is possible to speak directly to the boss or function head, such as the CMO, who owns the actual problem needing to be solved.
When there is a specific question to be answered through an analytics project, Gatt says Vibrato can deliver a return on investment through a proof of value in as little as four weeks.
“With data, that’s possible,” Gatt says. “But it really needs to be locked to answering questions, not just playing.”
Vibrato has also established relationships with technology providers such as Hortonworks, and makes use of the Snowflake cloud-based data warehouse tool to help accelerate data flows to make the fast turnaround possible.
Indeed, the migration of analytics tools to the cloud has proven beneficial for a number of channel companies, including Empired.
“I can stand up a Microsoft solution on my credit card and do it really quickly,” says Empired’s national business manager for data insights and integration, Ben Johnson.
“The other change is you used to have to knit all kinds of different tools together – Oracle for database and Business Objects for the front end and IBM for predictive analytics. All of that is blown away because the market is converged. Microsoft has an end-to-end data and AI capability that you can stand up quickly and start getting value straight away.”
However, reducing the barriers to entry from a technology perspective has also had other consequences.
“It’s new and it’s sexy and everyone is a data scientist and everyone is talking about AI,” Johnson says. “The reality is most of them are pretenders, because there has been such an influx of people who are interested but don’t understand the space.”
Are you experienced?
It seems experience remains one of the key attributes that sets the successful players apart from Johnson’s pretenders, and Johnson has hired a team of 70 specialists over the past two years to build out the necessary experience.
Jacka also rates staff development as one of his top priorities. He has hired everyone from scientists to mathematicians and artists, and then trained them up on the technology.
“The thing I look for in my team is thinking skills,” Jacka says. “If they have a good problem-solving mindset and aptitude for creative thinking, I like them. But it can take three to six months to get them ready, and I’d argue that it takes 10 years to become an expert.”
Justin Parcell, director of Arq Group business InfoReady, says the requirement to get the right skills meant he and founder Tristan Sternson personally oversaw the hiring of the first 60 or so staff.
Originally the firm focused on consulting services in the information management domain, primarily around IBM environments. More recently InfoReady has built competencies in AWS and Azure, and three years ago it rebranded as a data and analytics organisation, with a greater focus on delivering systems that support operational decisions rather than strategic decision making.
He agrees that the shift to the cloud has opened opportunities to service smaller clients.
“As the services that are provided by the cloud platforms become more sophisticated, and as the cost to stand up those solutions and implement them reduces over time, I think they are going to become more accessible to those smaller companies,” Parcell says.
But while the tools are becoming easier to stand up and work with, he believes that skills remain the limiting factor preventing other organisations from easily following in InfoReady’s footsteps.
“To become a fully-fledged data analytics organisation there is a significant barrier to entry,” he says.
Rod Jacka has been at the forefront of analytics in Australia for over 20 years. He is available for speaking and training engagements. Contact Panalysis to enquire about booking Rod for your next Analytics event.
Nothing causes senior management or a business owner to distrust the numbers faster than errors in the data. Once that trust is gone it’s very difficult to win back.
Any user of Google Analytics Enhanced E-commerce should understand that the recorded transactions will not match your accounting system exactly, but they should be very close. Typically we aim for an error rate of less than 5%, but have seen differences of up to 10% on a healthy website with no tracking issues.
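As a rough reconciliation check, you can compute the discrepancy rate between the two revenue totals and compare it against the thresholds above. This is a minimal sketch; the figures used are purely illustrative:

```python
def discrepancy_rate(ga_revenue, accounting_revenue):
    """Relative difference between GA-recorded revenue and the
    accounting-system figure, expressed as a fraction."""
    return abs(ga_revenue - accounting_revenue) / accounting_revenue

# Hypothetical example: GA recorded $96,500 against $100,000 in accounting.
rate = discrepancy_rate(96_500, 100_000)  # a 3.5% discrepancy
```

A rate under 0.05 is within the range we typically aim for; anything approaching 0.10 warrants investigating the issues listed below.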
Some of the key errors that might be influencing your data are listed below:
Whilst a small number of missing transactions is virtually unavoidable, sometimes errors can creep in.
Common issues that cause this are:
If the total count of transactions is higher than expected, it is likely that some customers have reloaded the confirmation page that sends the data to Google Analytics, causing those transactions to be double counted.
This can happen if you send a confirmation email to the customer that encourages them to return to their order details and your system reloads the Google Analytics code when the user visits that page. It can also happen if the customer simply reloads their confirmation page.
The solution to this is to only send the data to Google Analytics the first time. This may require modifications to your shopping cart as many will allow for the full transaction details to be reloaded when the confirmation page is refreshed.
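One way to implement the send-once behaviour is to persist a flag against each order and emit the tracking code only when the flag has not yet been set. This is a simplified sketch; the `ga_sent` field name and the dict-shaped order record are illustrative, and in practice the flag would live in your shopping cart's database:

```python
def should_emit_analytics(order):
    """Return True only the first time the confirmation page is rendered
    for this order. 'order' stands in for a persisted order record."""
    if order.get("ga_sent"):
        return False          # reload or revisit: skip the tracking code
    order["ga_sent"] = True   # in practice, persist this flag to the database
    return True
```

With this guard in place, refreshing the confirmation page or returning via an email link no longer fires a second transaction hit.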
To check if your data includes any duplicate transactions, click on the link below to view a custom Google Analytics report and select your normal Google Analytics view when prompted. The number of transactions per transaction ID should only be “1”.
Duplicate Transaction Report
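The same check can also be run locally over an exported list of transactions. This sketch assumes a CSV export with a "Transaction ID" column (the column name is illustrative and depends on your export):

```python
import csv
from collections import Counter

def find_duplicate_transactions(csv_path):
    """Count occurrences of each transaction ID in an exported CSV and
    return only the IDs that appear more than once."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["Transaction ID"]] += 1
    return {tid: n for tid, n in counts.items() if n > 1}
```

An empty result means every transaction ID was recorded exactly once, as expected.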
If your store receives a large number of refunds, these can significantly bias the data. Whilst it is a little harder to implement, Google Analytics can reverse transactions by sending a refund event. Removing these sales is important to ensure that you accurately measure the results of your marketing campaigns.
This is the number one issue that we are brought in to solve time and time again.
There are two main causes of this:
The first occurs when your payment gateway is on a different domain and the final confirmation page is on that domain. The symptom you will see is that all sales are attributed to that domain. The reason is that the customer is tracked as a different user on the payment gateway site.
The solution to this is to implement cross domain tracking so that the user can be tracked consistently across both domains.
Tip: Use Google Tag Manager. It makes solving this problem much, much easier than doing this using straight Google Analytics code.
The second occurs when the customer is taken to a different payment gateway and then redirected back to your own store to see the confirmation page. In this case you might see your own site, PayPal or some other payment gateway as the source of the sales from your site.
The solution to this is to add all of your sites and the payment gateway domains to the list of ignored referrers in your Google Analytics settings.
There are many other ways in which you can receive unexpected data. If you need help in solving these please contact us for support.
A common question that businesses ask themselves is “Which of my customers are the most valuable?”.
While this is a simple question, answering it can prove quite complicated. For example, how does the business define “valuable”? This could be customers who spend the most in total, or customers who have a large number of transactions. Alongside this there are other considerations, such as how recent the last purchase was, or the average basket size.
Luckily, we can use the Recency, Frequency and Monetary framework (or, RFM) to help us identify high value customers.
As the name suggests, RFM takes into account 3 factors for determining the value of different customers:
- Recency: When did the customer last buy an item? Can we identify if a customer is “active” or has “lapsed”?
- Frequency: How many times has a customer made a purchase? Can we identify customers who make many purchases, or few purchases?
- Monetary: How much in total has each customer spent?
Using these three dimensions can help identify different segments of the user base which would respond to different marketing messages and promotions.
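The three dimensions can be computed directly from a transaction log. This is a minimal sketch assuming transactions are available as (customer_id, purchase_date, amount) tuples; the function names and thresholds are illustrative:

```python
from collections import defaultdict
from datetime import date

def rfm_summary(transactions, today):
    """Summarise Recency (days since last purchase), Frequency (number of
    purchases) and Monetary (total spend) for each customer."""
    last = {}
    freq = defaultdict(int)
    spend = defaultdict(float)
    for cust, when, amount in transactions:
        last[cust] = max(last.get(cust, when), when)
        freq[cust] += 1
        spend[cust] += amount
    return {
        cust: {
            "recency": (today - last[cust]).days,
            "frequency": freq[cust],
            "monetary": spend[cust],
        }
        for cust in last
    }

def lapsed_high_value(rfm, min_spend, min_days_inactive):
    """Customers with high total spend but no recent purchase:
    prime candidates for a re-engagement campaign."""
    return [c for c, v in rfm.items()
            if v["monetary"] >= min_spend and v["recency"] >= min_days_inactive]
```

For example, `lapsed_high_value(rfm_summary(transactions, date.today()), 300, 90)` would pick out customers who have spent at least $300 in total but have not purchased in the last 90 days.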
PowerBI is a business analytics service that delivers insights by transforming data into stunning visuals. We have loaded example customer data into PowerBI to create an interactive visualisation for you to explore the RFM framework.
After loading sample customer data into the PowerBI platform, we can look at the demographic profile of those customers who score highly in the Frequency and Monetary dimensions.
The demographic data can help marketers keep their messaging relevant, should they choose to reach out to the different segments.
A particularly interesting group of customers are those who have a high Monetary score but a low Recency score. In other words, they are customers who have spent a lot of money but have not done so in a while. The business may want to consider strategies to re-engage those customers as doing so would likely prove to be very profitable.
The screenshot below shows those customers who have a high Monetary score but a low Recency score.
With the flexibility in filtering that PowerBI offers, it is a great platform for ad hoc customer segmentation for frontline marketers and stakeholders. There is also the ability to export the data once filtered, so users are really empowered to do what they wish with the data.
Panalysis is an industry leader in data and analytics.
Reach out to email@example.com if you would like to hear more about how we can help you with customer segmentation and visualisation using PowerBI.
Domo has recently released new Dashboarding functionality within their platform.
This is a shift away from the Page – Collection – Card way of organising visualisations, towards the more typical organisation people expect from the term dashboard. This move allows Domo to emulate much more closely what people have been building in Tableau, PowerBI and Google Data Studio.
With this dashboarding functionality there is now much more flexibility in the organisation and sizing of cards for the content’s end users.
In the Page – Collection – Card paradigm it was almost impossible to control in detail what pages looked like to users. This was because the cards placed on a page would organise themselves in a responsive-like fashion, depending on the user’s screen size. This meant that 2 cards which were side-by-side on your screen might appear one beneath the other on a user’s screen.
The dashboarding functionality puts an end to that. 2 cards placed side-by-side will always be side-by-side. If the screen does not fit them, then one of them will simply hang off the edge of the screen.
To showcase what this new functionality is capable of, I have designed a dashboard within Domo showing some statistics about Australia from the Australian Bureau of Statistics and the Reserve Bank of Australia. This data was collected into Domo using the built-in Quandl connector.
Within a Domo dashboard all the functionality of the cards remains so it is still possible to slice and dice the data as you wish within the dashboard.
Have a look at the quick video to see this in action.
Panalysis is an official Domo partner.
Reach out to firstname.lastname@example.org if you would like to know more about how Panalysis can help you with an existing or potential instance of Domo.
Since the introduction of DoubleClick in 1996, changes in technology have meant changes for digital marketers. With more channels, formats and data, and an increase in consumer awareness around how they are being marketed to and how their data is being used, there has been a need to provide marketers with new tools that make it easier to get better results with privacy at the forefront.
To meet these new realities, Google announced Google Marketing Platform at the end of last month.
The platform brings together DoubleClick Digital Marketing and the Google Analytics 360 Suite to aid in the planning, buying, measurement and optimisation of digital media and customer experiences. In one place, Google Marketing Platform.
Google Marketing Platform helps marketers to better understand their customers by offering tools to collaborate and share insights, achieving a customer-first approach to marketing.
Find out more about Google Marketing Platform from your Panalysis Contact.
Traditional offline marketing has always had an element of gut instinct driving it: marketers coming up with ideas and implementing them without getting to meaningfully assess the results — a campaign may seem successful, but is it better than an alternative would have been?
Digital marketing, though, is a different beast. Tagged and tracked to a granular level, it allows for an unprecedented level of optimization. And this tracking is only getting more in-depth as time goes by and more businesses come to see the value in extensive analytics that has long-since been recognized by the biggest brands in the world.
The process of identifying a target audience has always been as complex as the marketer wants to make it, but with a practical bottleneck preventing it from being significant at the distribution stage, it’s been of limited value. Why should you bother coming up with a narrow definition of optimally-relevant consumers if you can’t reach them?
The rise of triggered automation processes and complex platforms such as Facebook’s PPC service has turned this around — today, you can get very specific with your audience identification, knowing that you can actually target that audience in isolation if you simply choose the right parameters and workshop until the results come.
Want to reach only those people who’re of a certain age and fans of a particular product? If you use all the tools available to you, you can do it. This has various positive effects: marketing audiences get served materials more relevant to their interests, marketers can deploy more creative strategies, and tactics can be A/B tested and revised much more efficiently.
Digital marketing isn’t just about getting the results: it’s also about selling the services. After all, you can’t get results if you don’t have any clients, and you can’t win any clients if you can’t make a strong case for what you bring to the table. Back before high-level analytics became possible, there was a lot of showmanship to the process of convincing a client (much of which had little to do with actual results).
Now, though, any digital marketer with a great service and a strong record can form a compelling argument using their results alone — particularly if they use the kind of data visualization that can easily be achieved through software such as Google Data Studio (this can be complex, but it helps to get some assistance from suitable experts).
Essentially, the consequence here is that it’s getting a lot harder for snake-oil sellers to win clients in the digital marketing world. Simply saying “We do X, Y, and Z, and get results” isn’t enough, because prospective clients will expect to see the data backing up your claims — your average ROI, your typical timeframe for results, which channels you use most effectively.
For any truly skilled digital marketer, it’s a great time to be part of the industry.
Big data doesn’t just take from one stream: it draws in analytics from any and all available sources, creating cohesive views and making it clearer how the various elements of campaigns are working together (or against one another, as the case may be). With that power on tap, marketers are finding it vastly easier to devise broader strategies.
Think of how a retailer today can bring numerous channels together. Setting their stalls on rich data-collecting platforms such as Shopify’s multi-channel ecomm CMS, and using versatile automation and integration tools such as IFTTT and Zapier, they can build customized stores, niche landing pages, and numerous traffic drivers, all while knowing how all the factors coexist.
Before big data was ready for mainstream use, attempting a broad omnichannel marketing strategy would have been brutally arduous and required an exhausting level of manual investigation. But with data backing you up, you can achieve wonders. Any retailer today that settles for a single-channel strategy will struggle to keep up with competitors, no matter how finely-honed that one strategy happens to be.
On the whole, this is the biggest consequence of big data adoption. The more marketers understand about the various facets of their operations (the people they’re targeting, the materials they’re serving, and the funnels they’re moving consumers into), the better equipped they become to operate efficiently.
That means cutting tactics that aren’t working before they start to drain resources, making minor tweaks to enhance ROI, reaching only the most relevant audiences, and phasing channels and platforms in and out depending on their fluctuating popularity and influence. What’s more, this process is increasingly automated: programmatic PPC is already smoothing out the advertising pipeline, and there are even services such as Phrasee that are capable of generating ad copy.
The more data we collect, and the more ingeniously we use it, the better we can make digital marketing. Who knows what the industry will look like in a decade?