ActiveBatch Enterprise Workload Automation and Job Scheduling Overview


About & Video Transcript



Summary: Take a tour of ActiveBatch Enterprise Job Scheduling and Workload Automation, and learn how ActiveBatch's rich automation framework helps users integrate disparate applications, databases, and technologies into end-to-end workflows. 

 


- [Instructor] Welcome. In this presentation, we will introduce ActiveBatch, which will include discussing the three-tier architecture, learning some terminology, examining a use case, then closing with a summary. ActiveBatch is an enterprise job scheduling and workload automation solution developed by Advanced Systems Concepts, Incorporated.

It allows users to achieve end-to-end automation and run their workflows across a variety of different platforms, applications, and technologies. It automates the dispatch of repetitive processes to run on systems where the ActiveBatch software has been installed. These repetitive tasks can include batch processing, which typically involves the execution of a series of computer programs and scripts. Additional repetitive processes include running real-time business processes, which is the delivery of information about business operations as they occur.

Typically, the automation tasks, called Jobs, require little to no human interaction. Jobs can run in an unattended, background mode. Job examples include running scripts and executables, such as a PowerShell script or a UNIX shell script. You can also run Jobs Library Jobs, a Job type that provides organizations with the tools to build and automate their workflows quickly using production-ready Job Steps. With this Job type, ActiveBatch provides the tried-and-proven code for the Job Steps, and Job authors provide the property settings.

Just a couple of quick examples. We have Job Steps for sending out emails or transferring files via FTP and secure FTP. Jobs can start in a variety of ways, such as via a schedule or an event. As an example of an event, an email could arrive in a user's email inbox that ActiveBatch is monitoring. When that happens, a Job will start. ActiveBatch consists of three key components. Let's go over them now. First, there's the client component named AbatConsole, and it's the application that users connect to a Job Scheduler with. So it's the user interface. This connection is done on demand, where users can connect to the Scheduler, do their work, and then they can disconnect from the scheduler when they are done.

The next key component is the Job Scheduler. This is the heartbeat of the system. It connects to Execution Agents and dispatches Jobs to them. The Scheduler-to-Agent connection is a continuous connection. The final component is the Execution Agent, and this is the component where the Jobs actually run. Oftentimes, it's simply referred to as the Agent. We support a variety of operating systems that the Agent can be installed on. This is the only cross-platform component.

The Job Scheduler and Console applications must be installed on a Windows operating system. Let's look a little more at each component. The Client, named AbatConsole, which is oftentimes simply referred to as the Console, is the Windows user interface. It is the thick client that is part of a basic installation of ActiveBatch.

Just so you know, Advanced Systems Concepts does have other user interfaces, one of which is the Web Console, where you connect to the Job Scheduler with one of the more common web browsers, such as Google Chrome or Microsoft's Internet Explorer. The Console runs on Windows 10. If you were to install the Console on a server, for example the Job Scheduler server, and connect to the Scheduler with the local AbatConsole, of course it would run on that server as well. That would be Windows Server 2012 R2 through Server 2019, at least as of this recording. More on that in just a moment.

As stated, the Client is the user interface to the Job Scheduler. Developers use the application to create and maintain Jobs and other object types, such as Schedules. Operators oversee the runtime environment. They make sure the Jobs are running successfully as designed. Administrators configure the system. For example, they set up policies and configure security. With regards to the operating systems that ActiveBatch supports, for all of our components, please see the knowledge base article titled ActiveBatch Hardware/Software Platform Compatibility for the latest system versions supported.

Let's look at a little more information about the Job Scheduler. It runs as a Windows service 24/7. The Windows operating systems supported include the ones you see listed here. If the ActiveBatch Job Scheduler system is rebooted, the Job Scheduler service is automatically started. The Job Scheduler schedules and dispatches Jobs to Execution Agents. It updates Console users with Job statuses. It also updates the backend database. The database types that we support will be discussed shortly.

Let's look a little more at the Execution Agent. It's the system where your Jobs execute, once of course you've installed the Agent component. The Scheduler dispatches Jobs to the Agents, and the Agent reports back to the Scheduler when Jobs are complete. It's the only component that can run on Windows and non-Windows platforms. When running on a Windows system, it runs as a Windows service 24/7, and like the Job Scheduler, these are the currently supported Windows operating systems that you can install the Agent on.

Let's look at the platforms supported for the non-Windows Execution Agents. You can install the Agent on UNIX, Linux, and Mac operating systems, where it runs as a daemon. We offer an HP NSK Agent that also runs as a daemon. Another OS supported is OpenVMS, where the Agent runs as a detached process. Lastly, we support an AS/400 Agent, where that Agent runs as a batch process. Please note, you can also run an IBM mainframe Job, where the Job will run on a Windows Execution Agent, and that Agent will submit a JCL file to execute on the z/OS mainframe.

In addition to the three key components, ActiveBatch requires a backend database. This database consists of all the ActiveBatch stored procedures, views, and tables, where the tables are populated with the Jobs, Schedules, and other objects an ActiveBatch developer creates. It also stores all the data regarding the history of executed Jobs. This is called the instance history. The Job Scheduler application updates the backend database, so any time you install the Job Scheduler component, you must specify what database application you will be using. The choices are listed here. Customers will typically select the database product they have already invested in.

And here are the databases we support for the ActiveBatch backend database. Microsoft SQL Server, in the versions you see listed here. Microsoft Azure SQL Database, where, just as a quick note, we require that you manually create the database using the Azure portal, so you can be certain of the price and fees associated with the Microsoft Azure service. We require that the database service tier be S4 or larger. Lastly, we support an Oracle database, and the versions are listed here.

Let's take a look at a sample environment. A single user launched the client application, AbatConsole, to connect to a production Job Scheduler to do their work. The Job Scheduler is dispatching Jobs to run on two Execution Agent systems. We have many users that maintain multiple Scheduler environments for development, test, and production, just as a few examples. This is not a requirement, but it would be considered a best practice, allowing users to fully test Jobs as well as ActiveBatch software releases, before making changes to a production scheduler environment.

In this example, the test Scheduler is dispatching Jobs to a single test Execution Agent. We do support a single client connecting to two or more Job Schedulers within one AbatConsole session, provided the appropriate permissions have been granted. For every Job Scheduler, there is always a backend database where all your Jobs and Job history are stored. The database is configured during a Job Scheduler installation. And as you know, we support a couple of different relational database management systems. It's your choice which one to use.

There is one Windows installation kit that contains all three key ActiveBatch components. There is a features window you will encounter during the installation, allowing you to select which component or components to install. Let's look at a few ways that ActiveBatch is commonly installed. A user would typically install all three components on one system designated as the Job Scheduler machine.

By the way, as a best practice, the ActiveBatch backend database would typically be installed on a different system for high availability reasons. The Execution Agent only would be installed on one or more systems designated to run the Jobs. The client only would be installed on one or more systems designated for users that need to access the Job Scheduler to do their work.

Just as a quick mention, the non-Windows Execution Agents, such as UNIX and Linux, have their own separate installation kits that are clearly noted at our website on our product download page.

Let's talk a little bit about the ActiveBatch design. Developers create ActiveBatch objects, of which 14 object types come with a base license. One example is a Job object. All objects have properties that are set. Some are required, some are optional, and some have default values configured. The properties describe the object. Since Job properties define a Job, they are often called Job definitions or Job templates, in addition to simply a Job or Job object. Many objects can be shared with one or more Jobs and Plans. We'll discuss what a Plan is when we take a look at our use case. Some of these shareable objects are Schedule objects, User Account objects, and alerts, just to name a few. So there is this concept of object reusability. That is, a single object may be used by multiple Jobs.

For example, you may have five Jobs all triggering on the same dates and times, so they could be sharing a single Schedule object. If that Schedule object is changed, all five Jobs would be impacted by that change. Some objects create instances. This is true for the Job, Plan, and Job and Plan Reference objects. They generate instances because they are triggerable. An instance is a clone of your object, with all its property settings, so the object does what it has been configured to do. For the Job object, the instance is what is dispatched to the Execution Agent to run.

Let's take a look now at what is involved in general when configuring an ActiveBatch Job. Every Job needs some basic information, which is described here. First, a description of the work to be done is required. This is what we call the payload. This is accomplished by setting the relevant Job properties. Next, we need a system to run the Job on. This is accomplished by associating the Job to what's called a Queue object, where the Queue points to an Execution Agent system. Next, security credentials are required by the agent, because it needs to know what account to run the Job under. This is accomplished by associating the Job to what's called a User Account object, which is how user names and passwords are specified. Finally, we need a way to tell the system when to clone the Job. That is, when to create an instance. When to create a run of that Job. This is done by using a trigger mechanism.
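To tie these four pieces together, here is a minimal sketch in plain Python. This is an illustration only, not ActiveBatch's actual object model or API; the class and field names are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    # Hypothetical stand-in for an ActiveBatch Job object; these field
    # names are illustrative, not the product's real property names.
    name: str
    payload: str                  # the work to be done (script, Job Step, etc.)
    queue: str                    # Queue object pointing at an Execution Agent system
    user_account: str             # User Account object with run-as credentials
    triggers: list = field(default_factory=list)  # manual, schedule, or event

job = Job(
    name="DownloadFiles",
    payload="managed file transfer: download *.csv",
    queue="LocalQueue",           # resolves to the machine the Job runs on
    user_account="VendorAccount", # account the Agent runs the Job under
    triggers=["manual"],
)
print(job.name, "runs on", job.queue)
```

The point of the sketch is simply that a Job is inert until all four ingredients are present: without a queue there is nowhere to run, without credentials there is no one to run as, and without a trigger no instance is ever created.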

ActiveBatch supports a variety of trigger methods, including a manual trigger, scheduled triggers, or event-driven triggers to name a few ways in which Jobs are kicked off. Okay, switching gears, I am now going to bring up a system that has a full installation of ActiveBatch. I am going to provide you with an overview of the user interface, named AbatConsole. After the overview, we will look at an ActiveBatch use case.

Welcome to my system where I have the Job Scheduler installed. As a friendly reminder, the Scheduler and Execution Agent run as Windows services. Let's take a quick look at that. Here we have the Scheduler, and here's the local agent. The ActiveBatch Console is launched on demand.

So, let me go over to this desktop icon now and double click on it to launch the console application. The start page is the currently active tab in the console. I recently connected to a Job Scheduler with the system name of AsciSystemII, which happens to be the system we're working with now. The reason the client application does not auto-connect to the Job Scheduler is because I do not have auto-connect enabled. Let me show you. If I click on the ActiveBatch menu option and then select connect, it launches the connection manager, displaying the settings for the currently configured connection. Notice auto-connect is not enabled. Since that's the case, the console application did not automatically connect to the local Job Scheduler, therefore, I need to click on the connect button to establish the connection.

But before I do that, let's assume for a moment that I have never connected to a Job Scheduler before using this client application. I would have to configure a connection. To do that, I would click on this icon to the far left to create the new connection. I would need to identify the Job Scheduler server to connect to, where using the host name is one way to identify it. There are other ways not described in this presentation. I don't need to enter credentials unless I want to connect to the Job Scheduler with an Active Directory User Account that is not the same user that's logged on to this system. When left blank, like in this case, the Windows user's logon credentials are used to connect to the Job Scheduler.

Let me close out of this and not save this connection information, because I already have one that I created earlier. But before I connect to AsciSystemII, notice the resources available to you on the start page, which include access to specific web pages at the ASCI website, and on the right, a newsfeed that consists of ActiveBatch announcements and news. Towards the bottom of the start page are ActiveBatch version information, help, license information, and system information. You can also check for product updates, which requires that you log in to MyASCI in order to see and download the latest version of ActiveBatch from the website.

I will connect to my Job Scheduler now by clicking on the AsciSystemII connection that was previously established in a prior session. Notice the connection display name is tabbed in the ActiveBatch Console. I can toggle back and forth between the start page and the Job Scheduler I just connected to. If I have multiple Job Schedulers to connect to, I can do so within a single console session by creating a new connection for the other Job Scheduler and connecting to it; like this first connection, the second connection will be tabbed in the ActiveBatch console. Moving along, there are four key panes in the console.

First, there is the object navigation pane, where an ActiveBatch developer will spend time creating and managing ActiveBatch objects. Next, there is the pane shared by views and properties. Please note, there are other selections you can make in the console that are also tabbed in this pane. Next, there is the main view, where by default, the Job Scheduler information view is tabbed. The main view is updated as selections are made within the console that use this view. Next, there is the instances pane, which displays the instances of a currently selected triggerable object or queue in the object navigation pane. At least, that is one key way in which the focus of the instances pane is updated; there are other ways. Like the views pane, the instances pane is also shared by other views that are selected by clicking on the Open/Close Panes button you see here.

The panes already open have a check next to them. If you click on a checked pane, it closes; if you click on an unchecked pane, it opens. In summary, these are the default panes you will see when initially launching the console. The panes you have opened are saved every time you exit the console. The next time you launch it, the panes you last had open, and the locations they were in, are restored. If you ever want to reset the panes to the factory default settings, click on the Tools menu option, select Settings, click on Reset Settings, and then click on Reset Layout.

Let's look at the object navigation pane, oftentimes referred to as the tree. It is where ActiveBatch objects are saved and managed. At the very top of the pane is the Job Scheduler object. This object is created and owned by the system. All objects have properties and actions you can take against the object. To see what they are, right click on the selected object to pull up menu options you can choose from. Let me do that now on the Scheduler object. Notice the properties menu option. It's always at the bottom of the menu list, no matter what object you have selected in the tree. Let me select properties.

The properties of the selected object are tabbed in the main view, like we see here. Each of these tabs is what we call a property sheet; as I click on them, the information to the right changes to reflect the currently selected sheet. Some of the property sheets, like alerts and variables, have no values because nothing's been configured yet. To close a document that has been tabbed in the main view, you can click on the X, or right-click and select a menu option; in this case, I'll just select Close.

Now, as stated, the object navigation pane is an area of interest for ActiveBatch developers. They can add ActiveBatch objects to the root of the Job Scheduler, and by root, as a good analogy, think of the root of your C drive on a Windows system. So objects can be added to the root of the Job Scheduler. However, as a best practice, ActiveBatch developers are encouraged to add objects to folders and Plans and not clutter the root with too many objects. Here is how I would add a root level object. I would right click on the Job Scheduler object and then select new, and then pick an object to add.

Notice there are three root-level folders in the tree. An ActiveBatch developer created them. I can open them to see what's inside. And as I mentioned just before, when you right-click on an object, in this example the FTP Jobs folder, I can look at the properties of this object, where the properties are tabbed in the main view, just like the Job Scheduler object was. And I can then click on the folder's property sheets. I'll go ahead and close out the properties of this folder. I could add a new object to the folder with a right-click, New, and then selecting an object to add. If I open this shared objects folder, I see some objects that can be shared with multiple Plans and/or Jobs. We have other presentations that offer strategies regarding how to organize your objects in the object navigation pane.

Lastly, the three objects you see here are built in to the product, so an ActiveBatch developer did not create them. They have a special purpose also described in other courses. Next, let me provide you with a little more detail about the views pane. We have many different views. Some are geared more towards ActiveBatch operators, and others are specific to ActiveBatch administrators.

For example, in the operations folder, there are many views that report on instance data. The views report on the data a little differently, in how they look and how the data is organized. For example, the daily activity view. Notice it's displayed in the main view. It reports on instance data the same way that the operations view does, but the layout is different. How this information is presented is different. These are views for ActiveBatch operators to monitor currently executing instances, to look over the history of past instances, and to see what's coming in the future. If I scroll down a little bit in the views pane, there's an administrator folder. There are views specific to an ActiveBatch administrator, like the configuration view, tabbed in the main view, or the multi-factor authentication view, something an administrator can configure. There are many views with all sorts of functionality that are described in other training videos. For our purposes, we want you to feel comfortable navigating the console.

Let me close out all the views except the Job Scheduler information. To do that, I'll right click on the Job Scheduler tab and close others. Lastly, regarding the default panes that you see when you initially launch the console, we have the instances pane. This pane lists instances for the currently selected queue or triggerable object and returns instance data based on filter criteria set, if any. In other words, you can filter by the type of instance, that is the state the instance is in, and you can also filter by date as well as time when you're selecting the custom date option.

If I expand this Plan, the VendorC Plan, and click on this Job, notice it updated the instances view to show me an instance of this Job that meets this date criteria. If I change the date criteria to show me instances of this Job for the last seven days and then click on the refresh icon, now I see all the runs for that particular Job for the last seven days. So that's the instances pane.

At this time, I am going to introduce a use case, which is as follows. A user would like to download files from an FTP server, then archive the downloaded files. The objects required to do this have already been configured. Let's go to where those Jobs have been configured in the object navigation pane. I'll close this Plan, and I'll open this VendorA Plan. Before we talk about these two Jobs, just a brief mention that the folder object, the FTP Jobs folder, is used to help organize Jobs, Plans, and other objects in a meaningful way within the tree. It works very much like a folder in the Windows operating system.

The shared objects folder was defined by an ActiveBatch developer, and it's being used to store all the objects that can be shared with Jobs and Plans. If a property of one of these shared objects changes, it impacts all the Jobs and Plans that are using the object. The key term we use for using a shared object is that the Job or Plan is associated to the shared object. So we use the terms associated or associations a lot. The shared objects that a Job author can create in ActiveBatch are as follows. If I right-click on this folder and select New, the shared objects include the two different types of queues, and then all the rest of these object types.

Okay, let me get back to the VendorA Plan where our use case has been configured. Notice there are two Jobs in the Plan. They are sorted alphabetically by default. Think of the Plan object as a wrapper around typically two or more Jobs, although you could technically have one Job in a Plan. But typically there are two or more Jobs in a Plan that are somehow related and can optionally be set to run in a specific order. The Jobs could potentially run in parallel or they could, for example, run sequentially. It all depends on property settings that are configured. When a Plan is triggered, a Plan instance is created, and the Jobs in the Plan also have instances created where the Job instances only are sent to the Execution agents to run. The Plan itself does not get dispatched to an agent because it has no payload. The payload is configured in the Jobs. The DownloadFiles Job will run first and if it completes successfully, it will trigger the ArchiveFiles Job. Let me show you a view that graphically depicts the order in which the Jobs will run. It is called the Map View. I'll right click on this Plan and select view, and then map.
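The success-dependency between the two Jobs can be sketched in plain Python. This models only the control flow just described, where each downstream Job fires only if the previous one succeeded; it is not ActiveBatch's actual dispatch logic:

```python
def run_plan(jobs):
    # jobs: list of zero-argument callables returning True on success.
    # Each subsequent job runs only if the previous one succeeded,
    # mimicking a chain of success completion triggers inside a Plan.
    results = []
    for job in jobs:
        ok = job()
        results.append(ok)
        if not ok:
            break  # on failure, downstream jobs are never triggered
    return results

def download_files():
    return True  # pretend the FTP download succeeded

def archive_files():
    return True  # pretend the archive step succeeded

print(run_plan([download_files, archive_files]))  # [True, True]
```

Notice that the Plan itself does no work in this sketch; it only sequences the Jobs, which matches the transcript's point that a Plan has no payload of its own.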

The Map View depicts both of the Jobs in the Plan graphically, and it also shows that a completion trigger has been set up between the Jobs. Let me zoom to fit. This is the completion trigger I was just referring to. It's a property that's been set on the DownloadFiles Job: if this Job succeeds, it will then trigger the ArchiveFiles Job. Let's look at the properties of the DownloadFiles Job by right-clicking on the Job and selecting Properties. Notice the property sheets are tabbed in the main view. All objects have a general property sheet with a name and label. Let me click on the Jobs Library tab. It's one of our three Job types. Notice the Job type property here, and the two other options available to a Job author. In this case, the Jobs Library type was selected and configured. But before we get to the payload of the Job, which is this step, let's look at some of the execution properties.

First, there's a submission queue property. This is a required property for every Job. In this example, the Job is associated to an Execution Queue that is stored in the shared objects folder. And here it is, LocalQueue. Many Jobs could be using, that is, be associated to, this queue. Notice AsciSystemII in parentheses. This is the name of the system that has the Execution Agent installed on it. It is the system where the Job will run.

If I click on the ellipsis over here, it will actually pull up the property sheets of LocalQueue. And if I click on Properties, this is a required property for the Execution Queue, which identifies the machine where the ActiveBatch Execution Agent has been installed. Everything is local here to keep it simple for training purposes. We are running the Job on the local Execution Agent system, the same system that the Job Scheduler has been installed on. So every Job needs a queue. There are other queue types not discussed in this presentation. The Execution Queue, which is what LocalQueue is, identifies the machine where the Job will run.

An ActiveBatch Job author or administrator must create a queue object in order to run any Job in ActiveBatch. Without at least one user-defined Execution Queue, no Job will run in ActiveBatch. You have to have at least one Execution Queue. Okay, let me close out the queue property sheets.

Just below the submission queue property is the User Account property. This also must be set. It's the account that the Execution Agent will run the Job under. Notice the full path to this User Account which points to this object in the navigation pane. If I click on the ellipses, it pulls up the property sheets for that User Account object. And the key property settings are here. The username and password that will be used by the Windows Execution Agent to create this process with and run the Job under. Like all of our shared objects, you could have many Jobs associated to this same User Account object. It's a Windows Active Directory Account. Every Job needs security credentials, no matter what operating system the Job is running on. The way they are identified is through a User Account object. Let me close this out now.

Next, we have the payload of the Job. It's already been configured. The Jobs Library Job type is very handy in that it comes with templated Job Steps for frequently performed tasks, such as accessing databases, sending out emails, and uploading and downloading files, like we're doing here now, just to name a few. There's a lot of functionality in the Jobs Library. We have a single step that's been configured here. You could potentially have multiple steps in any given Jobs Library Job. The DownloadFiles step was taken from the managed file transfer category. Let's make that a little bigger. Let me scroll down.

Here it is. A user dragged and dropped this step, then configured the properties. That's already been done for us. The managed file transfer category provides a scriptless approach to managing your file transfer needs. An ActiveBatch Job author configured the properties. For example, what files to download: all the .CSV files in the train folder on the FTP server, to be placed on the Execution Agent system in this local path. A time filter was set to only download files that are older than one hour and greater than zero bytes, so we don't download any empty files. The FTP server that we're connecting to has been configured in a special type of User Account object. Notice here, under connection. This is pointing to a shared object in the tree. That's this object. Let's go ahead and take a look at it.
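The filter rules just described can be expressed as a small predicate. The sketch below is a plain-Python approximation of the selection logic (a .csv file, non-empty, modified more than one hour ago), not the Job Step's actual implementation:

```python
from datetime import datetime, timedelta

def should_download(name, size_bytes, modified, now):
    # Approximates the DownloadFiles step's filters: .csv extension,
    # greater than zero bytes, and older than one hour.
    return (
        name.lower().endswith(".csv")
        and size_bytes > 0
        and (now - modified) > timedelta(hours=1)
    )

now = datetime(2019, 9, 24, 12, 0)
print(should_download("orders.csv", 2048, datetime(2019, 9, 24, 10, 0), now))   # True
print(should_download("empty.csv", 0, datetime(2019, 9, 24, 10, 0), now))       # False: empty file
print(should_download("orders.csv", 2048, datetime(2019, 9, 24, 11, 30), now))  # False: too recent
```

The one-hour age filter is a common guard against downloading a file while it is still being written; the zero-byte filter skips placeholder files, as the transcript notes.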

Let's look at the properties. This is a User Account object with a credential type of managed file transfer connection. This is where the FTP server information has been configured. Everything from the FTP server host name, to the User Account that has rights to connect to it, to the protocol being used, and so on. It's all configured here in this managed file transfer connection type of User Account object. This is very useful if you have two or more Jobs sharing the same FTP server connection information. Should anything change about the FTP server, you only need to change it in one location, here on this object. Please note that the FTP server is running on the same system that ActiveBatch is installed on. This is for ease of training purposes; in the real world, it would likely be on a different server. So this is the DownloadFiles payload. The next Job is the ArchiveFiles Job. Let's look at the properties of this Job. Let's right-click and look at the properties.

I'm going to click on the Jobs Library tab because this Job is also a Jobs Library Job type. Here's the archive file step, taken from the file system category. The source files to zip up are the files that were downloaded and placed in this directory. And this is the destination, that is, the location where to store the zip file. Now, the name of the zip file contains a partially hard-coded file name, which is CSVFiles, and a soft-coded portion that references an ActiveBatch variable. The variable is referenced with ${variable} syntax, like you see here with today's date. This variable was defined on the FTP Jobs folder. Let's take a quick look at that variable by bringing up the property sheets of that folder and clicking on the variables tab. This is the variable that's been defined on that folder that the ArchiveFiles Job is using. This is the name of the variable. And the variable is an active variable, a type of variable in ActiveBatch that uses date arithmetic and will return today's date in this format.
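The ${...} reference can be approximated with a simple substitution pass. The sketch below is plain Python mimicking the syntax; the variable name and yyyy-mm-dd date format are assumptions for illustration, and this is not ActiveBatch's real variable resolver:

```python
import re
from datetime import date

def resolve(template, variables):
    # Expand ${Name} references against a dictionary of variable values,
    # a simplified stand-in for ActiveBatch variable substitution.
    return re.sub(r"\$\{(\w+)\}", lambda m: str(variables[m.group(1)]), template)

# An "active variable" is computed at run time; here it yields today's date.
# The name "TodaysDate" is hypothetical, chosen for this example.
variables = {"TodaysDate": date(2019, 9, 24).isoformat()}

print(resolve("CSVFiles_${TodaysDate}.zip", variables))  # CSVFiles_2019-09-24.zip
```

Because the value is computed when the instance runs, every execution of the Job produces a zip file stamped with that day's date, which is exactly the behavior the archive step relies on.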

We would like the archive file name to include today's date within the root of the file name, and we accomplish that by using an active variable, which is a feature of the ActiveBatch product. Let me go back to the ArchiveFiles Job for just a moment to point out that this Job is running on the same queue and User Account that the DownloadFiles Job is running on. Here's the queue. And the User Account. I'm just mentioning this to reinforce the concept of shared objects. These two Jobs are associated with the same queue and User Account.

Now, before I trigger the Plan, the remote files that we're going to download are located in the train folder, the .CSV files. They're going to be placed in the files one directory and then the ArchiveFiles step is going to take those .CSV files and zip them up and place them in this directory with this file name. So if I bring up the file system, this is the train folder. These are the two .CSV files that will be placed in this directory and once the zip file is created, it will be placed here. Okay, let me go back to the Map View. We can watch the Jobs within the Plan execute as they're running. That's one of the features of the Map View.

As an operator, you can actually watch the Jobs run. Let me show you what I mean. I'll go ahead and right-click on this Plan and trigger it. Let me change the instances view to today, and refresh. Notice over here in the Map View, the DownloadFiles Job has turned blue, meaning it's executing, and now it has completed successfully. Green denotes success. It therefore triggered the ArchiveFiles Job. Everything ran successfully. If I open up the Plan in the instances pane, I can see the two Jobs here as well. By default, all Jobs generate a log file where standard output and standard errors are captured. To view the log file, I can right-click on the instance and view the log.

This is the ActiveBatch Job log for the ArchiveFiles Job, and here's the log file for the DownloadFiles Job. If I bring the file system back up, I can see the zip file that was created from the two files that were downloaded, and those files were placed in this directory. Let me bring the file name of the zip file back up for one second. Notice the hard-coded portion followed by the date, which the date-arithmetic active variable produced. Today happens to be September 24th, 2019. Okay, let me minimize that. So there you have it: we triggered a Plan, and the two Jobs within the Plan ran on the local Execution Agent. The system to run a Job on is identified by the queue object; that's how the Scheduler connects to the Execution Agent. The user to run the Job under is determined by the User Account object, another required object for every Job you create. From there, the payload is configured, and then you can assign a trigger mechanism, in this case to the Plan. I triggered it manually, but there are many different ways to trigger Jobs in ActiveBatch: schedules, event-driven triggers, and so on.

Okay, let's go back to the presentation to summarize what we've learned. We have the three key ActiveBatch components. First, there's the client, named AbatConsole. That's the user interface used to interact with the Job Scheduler; you just saw me working with it. Next, there's the Job Scheduler component, which dispatches Jobs to agents, updates the backend database, and updates console users with Job statuses. Lastly, there's the Execution Agent, the system where Jobs run. In my use case, everything ran locally on a single system, but that was for ease of training. In the real world, you would have remote client users connecting to the system where you've installed the Job Scheduler, and the Job Scheduler would typically be dispatching Jobs to multiple remote agents.

This is a list of the ActiveBatch objects that come with the base license. We'll highlight the ones we looked at. There's the ActiveBatch folder object, used for organizational purposes. There are four objects that are triggerable, and we looked at the Job and Plan objects. Jobs get dispatched to agents, and multiple Jobs can be wrapped in a Plan where the intent is to trigger the Plan. Please note, you can have a standalone Job that triggers on its own and simply lives in a folder; it's not required that you place Jobs in a Plan. The remaining objects are shared, meaning they can be associated with more than one Job or Plan. We learned that every Job must be associated with a queue. It could be an Execution Queue, which we discussed in this presentation, or a Generic Queue, which you can learn about in a different video. We also learned that every Job needs to be associated with a User Account object. That's the account the agent will use to create and run the process.
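The shared-object idea, one queue and one User Account referenced by many Jobs, can be illustrated with a small Python sketch. The field names and values below are illustrative assumptions, not part of any ActiveBatch API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Queue:
    """Execution Queue: identifies the agent machine Jobs run on."""
    name: str
    machine: str

@dataclass(frozen=True)
class UserAccount:
    """Credentials the agent uses to create and run the process."""
    username: str

@dataclass
class Job:
    name: str
    queue: Queue
    user_account: UserAccount

# Both Jobs reference the same shared queue and User Account objects,
# as the DownloadFiles and ArchiveFiles Jobs did in the demo.
queue = Queue("LocalQueue", machine="localhost")
account = UserAccount("svc_activebatch")
download = Job("DownloadFiles", queue, account)
archive = Job("ArchiveFiles", queue, account)
```

Because the Jobs hold references to one object rather than copies, changing the shared queue or account changes it for every Job associated with it, which is the point of shared objects.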

Let's review some terms. There's the Job object, which is the Job definition, or template; Jobs are defined and stored in the Console's Object Navigation pane. A Job instance is a clone of the Job object, and the instance is what's sent to the agent to run. When a trigger occurs, whether manual or automated, the object is instantiated and an instance is created. Now, it might not run right away: perhaps the queue is offline, or perhaps there are preconditions that need to be satisfied before the Job can run. ActiveBatch supports different types of Job constraints that can be configured and must be met before a Job is dispatched. But whenever a trigger occurs, an instance is created.
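To make the template-versus-instance distinction concrete, here is a minimal Python sketch of the model described above. The class names, the constraint callables, and the state strings are all illustrative assumptions, not ActiveBatch APIs:

```python
import itertools
from dataclasses import dataclass, field
from typing import Callable, List

_instance_ids = itertools.count(1)  # simple instance counter

@dataclass
class JobDefinition:
    """The Job object: a reusable definition, i.e. the template."""
    name: str
    payload: Callable[[], None]
    # All constraints must be satisfied before an instance can run.
    constraints: List[Callable[[], bool]] = field(default_factory=list)

    def trigger(self) -> "JobInstance":
        # Any trigger, manual or automated, clones the definition into
        # an instance; the instance may still wait before executing.
        return JobInstance(next(_instance_ids), self)

@dataclass
class JobInstance:
    """A clone of the definition; this is what gets dispatched."""
    instance_id: int
    definition: JobDefinition
    state: str = "waiting"

    def dispatch(self) -> None:
        # Run only once every constraint check passes.
        if all(check() for check in self.definition.constraints):
            self.state = "executing"
            self.definition.payload()
            self.state = "succeeded"
```

The key point the sketch captures is that triggering and running are separate events: `trigger()` always creates an instance, while `dispatch()` may be held up by constraints.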

Let's look at this final page of terms. There are object and instance properties; these are things like the payload to run and the machine to dispatch the Job to. There are also object and instance methods; these are things that result in an action. For example, you can pause an instance, copy and paste a Job object, or trigger a Plan. Okay, you should now have a general understanding of key ActiveBatch terms and concepts.

ASCI offers a variety of training courses to expand and optimize the development, administration, and operation of your ActiveBatch environment. We look forward to providing you with courses that consider your experience level, ActiveBatch roles, and learning preferences.

Thank you for watching this presentation, and have a good day.

© 2019 - Advanced Systems Concepts, Inc.  All Rights Reserved.