Tag: Oracle CX Cloud - Commerce

With growing pressure on IT departments to increase the flexibility of their commerce systems and enable a faster pace of innovation, many Oracle Commerce customers are looking for ways to evolve their legacy systems.

Some consider moving to SaaS, while others are looking at alternative on-premise solutions - and many are building a platform-agnostic front end while also extending their investment in the Oracle Commerce platform. Regardless of which way you are leaning, understanding the benefits and challenges associated with different technology approaches, as well as learning from other Oracle Commerce customers’ experiences, will help you develop a plan that best fits your business needs.

In this webinar playback, you will learn:

  • Strategies and technology approaches Spark::red customers are implementing to evolve their legacy IT systems to increase business agility and support strategic initiatives
  • How to extend investment in the Oracle Commerce platform while also taking advantage of leading-edge innovations such as cloud, headless architecture and microservices
  • Pros and cons of different Oracle Commerce evolution strategies and technology approaches
  • Common challenges and considerations when developing and implementing your Oracle Commerce evolution strategy
  • Customer case studies with examples of specific technologies implemented, results achieved and lessons learned

This post spells out the components and software used in the Sites Demo integration with Endeca Guided Search. All individual components in the zip archive are listed, as well as the required WebCenter Sites software and Endeca software used in the demo. No productized code from either Endeca or Sites is included.

RL Client manipulates file-based assets, such as slots, scenarios and targeters. This can be useful for reloading assets, as well as moving them between environments – for example from production to quality assurance. Once you gather your assets, this tutorial walks you through how to create a file manifest and load the assets you need.

This post deconstructs Endeca application scripts and reconstructs them in Java, revealing their inner workings and familiarizing developers with the Endeca CAS, RecordStore and EAC APIs. These solutions may be useful to Endeca application developers who need more flexibility and control than the default scripts provide, and to those who prefer working in Java over BeanShell and shell scripts.

Proper care and feeding of your Content Acquisition System generational record store will go a long way towards ensuring satisfactory uptake of your ATG product data into Endeca. This post tells you how to update your CAS record store retention times, change the default maxIdleTime and safeguard the baseline_update process. These tips apply to Endeca releases prior to 11.0.

The whitepaper Three Patterns for Integrating WebCenter Sites with Oracle Commerce v 1.1 is now available for download.

Having clean log files is critical. Many modern software solutions are complex, and as that complexity increases, understanding the nuances of every integration point and code interaction becomes impossible. This post discusses the importance of clean, error-free startup logs. Whether you are running performance tests or regression tests, or are a developer working on a new feature, address the errors. They matter.

ATG Dust is a Java unit testing framework based on JUnit, meant for use with Oracle ATG Commerce. This post explains how ATG Dust works and covers best practices for using it. By running unit tests through the Dust framework, you are actually starting an instance of Nucleus and executing your test cases against a running instance of Oracle ATG Commerce.
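
As an illustration of that flow, here is a minimal sketch of a Dust-style test. It assumes the AtgDustCase base class and its copyConfigurationFiles and resolveNucleusComponent helpers from the Dust framework; the configuration directories and the /mycompany/pricing/PriceCalculator component are hypothetical placeholders for your own code.

    import atg.test.AtgDustCase;

    // Minimal ATG Dust test sketch. Directories and the component path are
    // hypothetical; substitute your own test config layer and components.
    public class PriceCalculatorTest extends AtgDustCase {

        @Override
        protected void setUp() throws Exception {
            super.setUp();
            // Copy a test Nucleus configuration layer into the config root
            // that Dust boots Nucleus from, excluding version-control files.
            copyConfigurationFiles(new String[] { "src/test/config" },
                                   "target/test-config", ".svn");
        }

        public void testComponentResolves() throws Exception {
            // Resolving the component starts a real Nucleus instance, so the
            // assertion runs against live ATG component wiring.
            Object calculator = resolveNucleusComponent("/mycompany/pricing/PriceCalculator");
            assertNotNull("Component should resolve against running Nucleus", calculator);
        }
    }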

TCP/IP Tuning

An overview of TCP/IP tuning. This post lists and describes the common TCP parameters, explains the key factors that affect your buffer needs, and shows how using jumbo frames can enhance performance. We show you how to set TCP parameters on Windows, Solaris, AIX, Linux and HP-UX, and list the common parameters on each system.
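
The kernel-level parameters covered in the post are set with operating-system tools, but an application can also request larger per-socket buffers, which the OS caps at its configured maximums. Below is a minimal Java sketch that requests and then reports the effective socket buffer sizes; the host, port and 4 MB request are illustrative values only, not recommendations from the post.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Requests large TCP buffers on a client socket and prints the sizes the
    // OS actually granted. Host, port and the 4 MB figure are placeholders.
    public class SocketBufferCheck {
        public static void main(String[] args) throws IOException {
            String host = args.length > 0 ? args[0] : "localhost";
            int port = args.length > 1 ? Integer.parseInt(args[1]) : 80;

            try (Socket socket = new Socket()) {
                // Ask for 4 MB buffers before connecting; the kernel silently
                // caps the request at its maximum (e.g. net.core.rmem_max on
                // Linux), so the effective size reflects the OS-level tuning.
                socket.setReceiveBufferSize(4 * 1024 * 1024);
                socket.setSendBufferSize(4 * 1024 * 1024);
                socket.connect(new InetSocketAddress(host, port), 5000);

                System.out.println("Effective receive buffer: " + socket.getReceiveBufferSize());
                System.out.println("Effective send buffer:    " + socket.getSendBufferSize());
            }
        }
    }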

A large project is defined as having 200,000 assets or more. There are several ways to speed up deployment, such as disabling the option to purge deployment data automatically. Note that DAF deployment process data must then be purged manually after the deployment completes.