In Summits past, I have always let Peter attend all the “Top Features” sessions, because as a Systems Engineer (and management to boot) I wasn’t that interested in the “features” of AEM except for how they impacted me on the Operational side of things. However, now that I have made the jump over to the customer side, I took the opportunity this year to go to the “AEM Assets” Top 10 session hosted by Josh Ramirez and Elliot Sedegah.
Having just attended the “Top 10 Features of AEM Sites” earlier in the day, I was really surprised at the difference in tone and presentational style. Rather than a laundry list of features, we were treated to a much more dynamic visual and auditory presentation. Some of the announcements were really exciting and addressed very real pain points my AEM Assets end-users endure daily, while others seemed a bit too “high-concept” or conceptual for me.
1. User Experience
Josh kicked off the demo portion of the presentation with a handful of improvements that, while they might seem minor on the surface, have a lot of potential to improve the every-day working experience with AEM Assets. Highlights of this portion of the presentation included:
- fine-tuning the size of the “Cards” used in the “Card View” of TouchUI;
- ability to modify which columns are displayed during the “List View”;
- the ability to quickly navigate up and down multiple levels of folder using the “Content Tree” viewer;
- the ability to filter on more granular search facets within search term results;
- a new “Smart Translation” feature which uses machine learning to translate search terms into English before attempting to perform a match;
- showing the number of matched results on a search term;
- improved performance of lazy loading in search results;
- better integration with browser history - clicking the back button on an asset details page will maintain the search context you came from.
As you can see from the list above, there are a lot of detail-oriented touch points that will really be appreciated by people who have to live in TouchUI. I’m particularly interested in seeing how the “Smart Translation” feature pans out, as translating metadata fields is a huge problem on the horizon for me in my daily use of AEM: my employer translates content into more than 100 languages, and not having to manually translate all of our metadata into those 100+ languages would save us an uncountable amount of effort. But, as with all things “machine learning,” I’ll hold my excitement in check until I see real results delivered.
2. Metadata Tools & Rules
6.4 has some really good usability enhancements around metadata management. What felt like an afterthought in 6.3 and earlier has really come into its own with the addition of cascading / contextual metadata, which allows you to create relationships between metadata fields, making it a less painful experience for those who have to enter the metadata to drill down to the right “set” of metadata for a given asset. This allows you to hide unrelated metadata fields from the Properties page, only surfacing them when they become relevant - for instance, if the Asset type is changed, a different set of fields might apply.
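To make the idea concrete, here is a minimal sketch of what a cascading rule evaluator does conceptually - the value of one controlling field determines which dependent fields are surfaced. All of the field names and the rule structure below are illustrative assumptions for the sake of the example, not actual AEM APIs or schema names:

```python
# Hypothetical sketch of cascading / contextual metadata: the value of a
# controlling field determines which dependent fields are shown on the
# Properties page. Field names and rule structure are made up for
# illustration; this is not the AEM implementation.

# Map each controlling field's selected value to the dependent fields
# that become relevant for that value.
CASCADING_RULES = {
    "assetType": {
        "photo": ["camera", "photographer", "location"],
        "video": ["duration", "codec", "director"],
        "document": ["author", "pageCount"],
    }
}

# Fields that are always shown, regardless of any rule.
ALWAYS_VISIBLE = ["title", "description"]

def visible_fields(metadata: dict) -> list:
    """Return the metadata fields to surface for this asset, hiding any
    dependent fields whose controlling value was not selected."""
    fields = list(ALWAYS_VISIBLE)
    for controlling_field, value_map in CASCADING_RULES.items():
        selected = metadata.get(controlling_field)
        fields.extend(value_map.get(selected, []))
    return fields

print(visible_fields({"assetType": "photo"}))
```

A “photo” asset would surface only the photo-specific fields, while an asset with no type selected falls back to the always-visible set - which is exactly the “less painful” drill-down experience described above.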
Elliot’s presentation of these features really captured my attention here, as he talked about the inflection point between those who run the DAM and those who know the content.
On the DAM side, we know that assets which don’t have the appropriate metadata fields will not be surfaced well to the consumers who need the digital assets; while those who live in the content world have the information they could provide to deliver those assets to the users where and when they need them, but they’re often not engaged in entering the metadata into the system. For some people, it might just be one more thing in an already overflowing to-do list; for others, they might not understand the full impact of accurate and complete metadata; and for still another group, perhaps it’s just too painful to use the UI to enter the required information.
Regardless of why those content curators might have been resisting participation in the metadata population activities, Adobe is trying to strip away the excuses.
3. Smart Tags: Custom Made
They covered an improvement to Smart Tags which allows you to use your own custom taxonomy to train Sensei to tag your images with your own meaningful tags; while it sounds spectacular, I still have never seen the Smart Tags feature actually work in a production DAM instance. See my previous commentary regarding machine learning promises at the end of item #1.
4. Adobe Stock Integration
This seems like an obvious enhancement for Adobe - with this feature, every instance of AEM Assets also immediately becomes an advertisement for their Adobe Stock offering. By deleting the “Location: Assets” filter from your search term, a correctly configured AEM instance can also link in all Adobe Stock images which match the same term at the top of the results page. This also benefits greatly from the optimizations made to lazy loading, given that Adobe Stock will have a lot more assets than many initial DAM implementations. Note that this feature was explicitly described as “Coming Soon.”
5. Adobe Asset Link
This, in my opinion, is the “killer app” of Adobe Experience Manager, and something I know a lot of AEM customers expected to see much earlier on. This allows you to log in to your AEM Assets repository directly from tools such as Photoshop or Illustrator, after installing a Creative Cloud extension. The ability to navigate the AEM Assets repository directly inside Photoshop has the potential to solve a lot of DAM adoption issues. This is another feature that is described as “Coming Soon.” My employer has recently been added to the beta for this feature, and I’m excited to share our experiences with it here on AEM HQ.
6. DAM Ecosystem
This was essentially a (re-?) announcement of the Adobe Exchange for AEM Assets. They demonstrated an example using Silicon Publishing to integrate AEM Assets with InDesign Server. This is another item you can expect to see future articles in AEM HQ covering, as we’ve been working with Silicon Publishing for some time to implement this feature internally.
7. Smart Crop
Here they covered the ability for end-users to retrain their machine learning engine (Adobe Sensei?) about what the important “focal point” in a given asset is. Using Dynamic Media, your AEM Sites can leverage the retrained Smart Cropped assets during the selection of assets at various breakpoints in your responsive designs.
Josh also demonstrated the new “Smart Swatch” feature, but I have to be honest here and admit I’m not graphically inclined enough to know what is exciting about that feature. I can tell you it will automatically identify dominant colors and patterns in your assets, and build a “swatch” out of those, but I don’t know what those are used for.
8. 360° and VR Experiences
This one was more conceptual than practical, in my opinion. There was an interesting panorama viewer they demonstrated which allows you to interactively pan through a panoramic photo right in your browser, and the ability to view your assets on a VR display. While on one level I applaud Adobe for trying to get out in front of the VR Marketing landscape, on the other I feel like it’s far enough out from being a realistic problem for anyone I know that they could have better spent some of those resources on some of the low-hanging fruit. If you’re of a different opinion and want to write/talk about your VR Marketing plans, please reach out to firstname.lastname@example.org and let us know - we’d love to feature you on an upcoming podcast or article!
9. Interactive Experience Fragments
With 6.4, you can edit “Experience Fragments” to create what Adobe calls “hotspots,” which are clickable regions that navigate to other sites, assets, pages, or Experience Fragments.
I don’t want to seem too dismissive of this feature, but the way it was presented at Summit made it seem like Adobe is pretty pleased with themselves for reinventing HTML 3’s “image maps”. Maybe I’m missing something, but it didn’t really grab me as an important improvement.
There was also a demonstration of 3D rendering of a model out of Assets with the ability to interact with the 3D model, and to swap out different “stages” (for example, an American Southwest Desert vs. a West Coast Beach). Again, perhaps it’s simply my bias showing through, but I am not sure who is really clamoring for this feature. Perhaps it is simply the bedrock for some future improvement which will really amaze me, but not today.
All told, for my first time attending a “Top Features” session at Summit, I walked away really excited about where AEM had come in the past 12 months, and I was able to reinvigorate my end-users about the platform and some of the things that would soon† become available to them.
†: I’m hoping to write a future article detailing some of our experiences upgrading a large AEM Assets 6.2 implementation to 6.4.