Tuesday, March 15, 2016

Improving the Quality of Adobe Dynamic Tag Management Delivery

A tag manager provides a container for deploying tags: analytics tools, AdWords integration, Floodlight or 3rd-party beacons, and other dynamically deployed scripts.
The Adobe Dynamic Tag Management architecture is organized into web properties, each containing global data elements, tightly coupled analytics integration tooling, and flexible event- and page-rule management.

The most flexible set-up is generally a single web property, hosted through a content delivery network or as a self-hosted library.
Within this single web property there is a production and a staging version of the tag management libraries.
All of the lower life-cycle environments associate with the staging version of the DTM (Dynamic Tag Management) libraries.

The live production site associates with the production version of the DTM libraries. In addition, there is a DTM Switch browser plugin, available for both Google Chrome and Mozilla Firefox, that lets DTM users easily toggle debug mode and staging mode. The DTM Switch plugin enables debugging, validating changes, and measuring page load performance.

DTM has been designed to be intuitive, enabling marketers to quickly deploy tags and reduce dependency on IT.

DTM, however, cannot be entirely decoupled from the rest of the solution.  There is a shared effort to optimize page load, reduce JavaScript errors, and minimize duplication.
The quality of a DTM delivery therefore has some dependency on IT governance.  This education entry addresses the areas where IT governance takes responsibility.

General Delivery Architecture

Every page within the solution needs to have a pointer to the DTM library along with a satellite invocation point.

Ideally, integrating DTM into a solution requires only a single change, assuming there is a common page-rendering fragment such as the header, the footer, or some more granular shared reference.

That single reference to the DTM library must be a synchronous file load into the web page, which adds some load time.
The best route is to serve the library through the CDN (content delivery network) so it can be cached, dramatically minimizing the impact.
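As a sketch, the page integration typically looks like the following; the library path below is a placeholder, since DTM generates the actual embed code per web property:

```html
<html>
<head>
  <!-- Synchronous DTM library include: placed as high in <head> as possible.
       The src below is illustrative; use the embed code DTM generates. -->
  <script src="//assets.example-cdn.com/satelliteLib-XXXX.js"></script>
</head>
<body>
  <!-- page content -->

  <!-- Satellite invocation point: last element before </body> -->
  <script type="text/javascript">_satellite.pageBottom();</script>
</body>
</html>
```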

Once the DTM library is loaded into the web page, the functionality within DTM deploys all of the injected script tools, data elements, and event-based and page-based rules.


As outlined in the diagram, it is assumed that all secondary threads (inner injected scripts) load asynchronously; however, that assumption deserves governance.
Part of the responsibility of IT governance is to keep track of the trigger rules imposed for each of the inner scripts.

The following page trigger rules each have various benefits:

  • Top of Page
    • Some 3rd-party integrations may require global injection so that the script loads before subsequent page assets - this strains page load optimization
      • Sequential HTML: Injected into <HEAD/> below the DTM library include script if <SCRIPT/> tags are used; otherwise injected at the top of <BODY/>
      • Sequential JavaScript global: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Sequential JavaScript local: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Non-sequential JavaScript: Injected as asynchronous <SCRIPT/> in <HEAD> below DTM library include script
      • Non-sequential HTML: Injected as hidden IFRAME and does not affect page HTML
  • Bottom of Page
    • Assets that need to be loaded before the page is ready for JavaScript operation - this strains page load optimization
      • Sequential HTML: Injected after the _satellite.pageBottom() callback script with document.write(), prior to DOMREADY, so that the visible page is not destroyed
      • Sequential JavaScript global: Injected after _satellite.pageBottom() as JavaScript include <SCRIPT/>
      • Sequential JavaScript local: Injected after _satellite.pageBottom() as JavaScript include <SCRIPT/>
      • Non-sequential JavaScript: Injected as asynchronous <SCRIPT/> in <HEAD> below DTM library include script
      • Non-sequential HTML: Injected as hidden IFRAME and does not affect page HTML
  • DOM Ready
    • Assets that can be incorporated into the restructuring of rendered tags just before JavaScript operation begins - this is the safest option for page load optimization
      • Sequential HTML: Will not work because DOMREADY is active and document.write() would overwrite the page
      • Sequential JavaScript global: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Sequential JavaScript local: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Non-sequential JavaScript: Injected as asynchronous <SCRIPT/> in <HEAD> below DTM library include script
      • Non-sequential HTML: Injected as hidden IFRAME and does not affect page HTML
  • Onload (window load)
    • Assets that can fire JavaScript at any time after all other JavaScript operations have completed; this is usually used for firing pixel beacons to 3rd parties.  This option is optimized for page load performance.
      • Sequential HTML: Will not work because DOMREADY is active and document.write() would overwrite the page
      • Sequential JavaScript global: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Sequential JavaScript local: Injected into <HEAD/> below DTM include script as JavaScript include <SCRIPT/>
      • Non-sequential JavaScript: Injected as asynchronous <SCRIPT/> in <HEAD> below DTM library include script
      • Non-sequential HTML: Injected as hidden IFRAME and does not affect page HTML
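The non-sequential JavaScript case above can be illustrated with plain DOM injection. The helper below is a hypothetical sketch (not DTM internals); the document object is passed in so the function can be exercised outside a browser:

```javascript
// Hypothetical sketch of non-sequential (asynchronous) script injection.
// Appends an async <script> element so it does not block page parsing.
function injectAsyncScript(doc, src) {
  var script = doc.createElement('script');
  script.src = src;
  script.async = true;          // non-blocking load
  doc.head.appendChild(script);
  return script;
}
```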

Page load speed improvements

Based on the above concerns for the trigger rules, there is a high level approach to improving page load speed.

There are general page-load-speed considerations that incorporate both DTM and all other page-rendering assets.
There are essentially six areas that affect page load speed:

  1. Caching of site assets
  2. Creating site pages from the server
  3. Reducing the number of downloaded files
  4. Reducing the size of downloaded files
  5. Improving the connection speed to source files
  6. Converting sequential file loading to parallel file loading


Specific to the DTM concerns, IT governance needs to create a check-list and ensure the following details are satisfied across DTM and other rendered page assets:

  • Self-hosted static file delivery that eliminates the dependency on third-party hosting and DNS
  • Parallelization of tag loading through asynchronous delivery
  • Tag killing options with timeouts that can be modified by the client
  • Dramatically reduced file size through the optimized dynamic tag management library schema
  • Enhanced file compression and delivery
  • Optimized timing and delivery sequencing that can be modified by the client
  • Client-side delivery of file assets that eliminates any additional server-side processing

The added responsibility of the IT group is to focus on the non-functional requirement above, "Optimized timing and delivery sequencing that can be modified by the client".
The sequencing and timing can be managed by keeping track of all page trigger rules as outlined above.
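One lightweight way to keep track of trigger rules is a shared inventory that reviewers can diff on every change. This is a hypothetical governance aid, not a DTM feature; the rule names are illustrative:

```javascript
// Hypothetical trigger-rule inventory for governance review.
// Each entry records where a 3rd-party script fires and how it loads.
var triggerRuleInventory = [
  { rule: 'Analytics base tag', trigger: 'top of page', mode: 'sequential JS global' },
  { rule: 'Remarketing pixel',  trigger: 'onload',      mode: 'non-sequential HTML' },
  { rule: 'Survey invite',      trigger: 'DOM ready',   mode: 'non-sequential JS' }
];

// Flag anything still loading synchronously at the top of the page,
// since those entries strain page load optimization the most.
function rulesNeedingReview(inventory) {
  return inventory.filter(function (r) {
    return r.trigger === 'top of page' && r.mode.indexOf('non-sequential') !== 0;
  });
}
```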

Event-based rules best practices

There are instances where additional analytics are added based on events such as button clicks or other user interaction.
Apart from DTM, there may be other scripts in the solution that also depend on trapping user interaction on the rendered tags.

This causes concern when double-click protection against tags is required.   jQuery has some bind/unbind requirements when handling event trapping.

Other solutions use Dojo, which also has feature-rich event functionality.  The strategy is for IT to simply keep track of all event-based rules, whether in DTM or in other script dependencies in the solution.
For double-click prevention, these assets should always be regression tested upon any event-based rule change in DTM.
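As an illustration of the double-click concern, here is a hedged sketch of a guard that ignores repeat clicks within a short window. It is plain JavaScript; jQuery bind/unbind or Dojo equivalents follow the same idea. The clock parameter exists only so the sketch can be tested deterministically:

```javascript
// Wraps a handler so rapid repeat invocations within `waitMs` are ignored.
// A minimal sketch of double-click protection: DTM event-based rules and
// other site scripts should not both attach unguarded handlers to one tag.
function onceWithin(handler, waitMs, now) {
  now = now || function () { return Date.now(); };
  var last = -Infinity;
  return function () {
    var t = now();
    if (t - last < waitMs) return;   // swallow the duplicate click
    last = t;
    return handler.apply(this, arguments);
  };
}
```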

Publish Workflow

Across the tools as well as the rule-based scripts, IT should provide governance over coding best practices.

The general workflow in DTM enables IT to govern the code through review.
For every change in DTM there is a historical comparison between versions, as well as notation coverage through which the marketing team can assign comments to each change.
Additionally, the IT governance team can add its own comments aligned with each change.

The most important workflow state is the publish operation.



As outlined in the diagram, all changes are queued into a "Ready to Publish" state in which they can easily be tracked and reviewed.  Once the changes are reviewed, the "Publish" operation pushes all of the approved changes into production.

As part of the review operation, below are some considerations for DTM coding best practices.

DTM Coding Best Practices

Based on my previous education entry, Improving the quality of WebSphere Commerce customizations, regarding the delivery of quality eCommerce solutions, there are some key points that can be reiterated here for DTM coding.

Common themes of unit testing

Part of the responsibility of reviewing submitted code is to ensure there is unit-testing coverage.
It is best for the reviewer to ask the author of the DTM code change what was covered in the unit testing and how it was addressed.
As suggested earlier, the DTM Switch plugin enables some testing in production, which yields the most realistic unit-testing results.
However, testing in production does not guarantee complete unit-testing coverage.  Based on a review of the coverage, a reviewer may be able to identify outstanding concerns that still need to be addressed in unit testing.
The easiest approach to unit-testing review is to always enter the unit test cases into the notes under the change entry for the submission.

Practising defensive programming techniques

This is the single most important factor in governing code change, especially when it comes to JavaScript.
Even though DTM has outstanding seamless error handling (through iFrame injection and the separation of tag-script responsibility), there are still cases where expected tags are missing on the page, which breaks dependent code.  In some cases, when JavaScript breaks, the entire site can break.
This is especially true when the code executes site-wide, on every page, since some pages may not include the expected tags or may contain custom formatting that is not under the control of the DTM scope.
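A hedged sketch of the defensive pattern: a site-wide rule should verify that its expected tag exists before running, rather than assuming every page renders it. The selector name is hypothetical, and the document object is injected so the sketch can run outside a browser:

```javascript
// Defensive site-wide rule body: bail out quietly when the expected
// tag is absent, instead of letting a TypeError break the page.
function readProductSku(doc) {
  if (!doc || typeof doc.querySelector !== 'function') return null;
  var el = doc.querySelector('.product-sku');      // hypothetical selector
  if (!el || el.textContent == null) return null;  // tag missing on this page
  var sku = String(el.textContent).trim();
  return sku.length ? sku : null;
}
```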

Readable code

Keep the code readable.  Most of the time, the 3rd party will send over some code that can easily be incorporated into DTM.  In most cases it is straightforward to title the rule based on the 3rd party and the expected external support for the script.
With IT governance, however, rewrite that 3rd-party code in a manner that ensures it can be easily understood from its details, rather than depending on the title of the rule to define the operation.

Handle nulls correctly

Although it may seem repetitive to continually check for nulls for every single variable, do it so that the code is protected.  Optimistic coders always assume the best, and they may usually be right.  However, the one time there is a null or invalid parameter, the code fails, and when that JavaScript is imposed on every page by a component, the entire solution could fail.
Whenever tags are consumed for their data, always first check that the element exists, then check whether the element is null, and only then consume the element data; the code will then be safe.
Additionally, whenever iterating elements or sub-elements of the page tags, check that those elements exist before initiating the iteration.
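The check-then-consume sequence above can be captured in a small helper. This is a hypothetical utility, not a DTM API; it walks a property path and returns a fallback instead of throwing:

```javascript
// Null-safe nested lookup: returns `fallback` if any step of the
// path is missing, instead of throwing a TypeError mid-page.
function safeGet(obj, path, fallback) {
  var cur = obj;
  for (var i = 0; i < path.length; i++) {
    if (cur == null) return fallback;   // null or undefined along the way
    cur = cur[path[i]];
  }
  return cur == null ? fallback : cur;
}
```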

Catch everything

Every script that depends on reading tags, and every script that pulls data, should provide a try/catch section.  That way, if there are any unknown errors, at least they will not disrupt the operation and will not display in the browser debugger console.   JavaScript makes the catch easy, so it does not add much overhead to impose this as a constraint in code review.
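A minimal sketch of the catch-everything constraint: wrap each tag-reading script body so an unexpected error is contained. The error hook is illustrative, not a DTM API:

```javascript
// Runs a tag-reading function and contains any error it throws,
// so one broken integration cannot disrupt the rest of the page.
function runContained(fn, onError) {
  try {
    return { ok: true, value: fn() };
  } catch (e) {
    if (onError) onError(e);   // e.g. staging-only console logging
    return { ok: false, value: undefined };
  }
}
```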

Logging

It is very useful to write debug output to the browser console, because the DTM scripts can then be validated during page and dynamic-script operation.
Wrap that logging with a debug flag so that only the staging version of the code traces verbose output to the browser console.  There is no value for end users in viewing browser-console activity.
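A hedged sketch of the debug-flag wrapper. How the flag gets set (build step, staging embed, cookie) is up to the solution; the sink parameter exists only so the sketch can be tested without a real console:

```javascript
// Only traces when the staging/debug flag is on, so production
// end users never see verbose output in the browser console.
function makeLogger(isDebug, sink) {
  sink = sink || function (msg) { console.log(msg); };
  return function log(msg) {
    if (!isDebug) return false;   // silent in production
    sink('[DTM] ' + msg);
    return true;
  };
}
```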

Tag Data Enhancement

If you have read this far into this education material, here is some bonus material that will really help the entire solution.
Tools, AdWords, 3rd-party beacons, Floodlight and other operations tend to use the same tag information in the page.
Most of the tags are in various areas of the page, and some of the tag data may not exist.
One benefit the IT team can provide the DTM team is a foundation of tag data available on each page throughout the solution.
Remember, Dynamic Tag Management depends on one main factor - tags and the information those tags carry.
Create a set of hidden tags with the same class name that contain the required information, which can then easily be consumed by DTM.
That way, the DTM scripts can iterate over the known class and, with the resulting array, inspect each tag, read its information, operate the logic, and contain the integration points.
This reduces the dependency on spurious tagging expected across pages, while enhancing the operational support when integrating with analytics and 3rd parties.
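A sketch of the hidden-tag foundation under assumed names (the class name and data attributes are illustrative, and the document object is injected so the sketch can run outside a browser): the server renders hidden elements with a shared class, and the DTM script collects them into one object for the rules to consume:

```javascript
// Server-rendered hidden tags (illustrative markup):
//   <span class="dtm-data" data-key="pageType" data-value="product"></span>
//   <span class="dtm-data" data-key="sku" data-value="ABC-123"></span>
// Iterates the shared class and builds one key/value map for DTM rules.
function collectTagData(doc) {
  var out = {};
  if (!doc || typeof doc.querySelectorAll !== 'function') return out;
  var nodes = doc.querySelectorAll('.dtm-data');   // hypothetical class
  for (var i = 0; i < nodes.length; i++) {
    var key = nodes[i].getAttribute('data-key');
    if (key) out[key] = nodes[i].getAttribute('data-value');
  }
  return out;
}
```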

Validation

When the tag manager submissions are published, the site consumes the changes and the tag behaviour engages.  Real-time validation is then required to ensure that tag information is collected properly and communicated to 3rd parties.  There are two tools which can help facilitate this validation:

Ghostery

The Ghostery plugin installs quickly and easily into the browser.  Once it is configured and enabled, it begins alerting in the browser to all of the trackers that are executing on any web page.

The diagram below illustrates the tracker alerting capability:

For each of the trackers that are listed, you can click into the communication specifications to drill into the data that has been collected while also validating the endpoint integration with the 3rd parties.  You can capture the communicated URLs that are constructed by the trackers and share them with your integration points of contact.

Google Tag Assistant

The Google Tag Assistant behaves much like Ghostery in that it reports the activity that is captured and communicated to third parties by the tagging scripts.

Below is a snapshot of the trackers that are exposed through the Tag Assistant:


Tag Assistant provides much more debugging insight to help you understand where there may be errors in the tagging scripts.   Google Tag Assistant is tightly coupled with known trackers, so only specific implementations can be debugged.

The added value of the Tag Assistant add-on is that it can provide detailed low-level tag information based on the metrics intended to be communicated to the third parties.

Below is a sample of what appears in the metrics report for a particular tracker:


Based on the add-ons reviewed above, you can get adequate coverage to validate that the tag management scripts are effectively capturing the right information and communicating with third parties as per the specification expectations.


