
Inaport Logging for CRM Integration and Migration

 

This post provides a detailed description of how to use Inaport’s powerful and extensive job logging functions for successful CRM integration and migration projects.

Many different types of job runs can be involved in a CRM migration, a CRM integration, or a proof of concept. You may be transferring ten rows of data or millions; you may have scheduled one job to run overnight, or hundreds. Whatever the scale, you need to be sure that everything has run correctly, and that any problem is automatically flagged and easy to track. This is where Inaport logging comes in.

Inaport logging lets you create log files to record overall statistics, as well as detailed information, for each job or batch of jobs. With Inaport logging activated you can:

  • check that jobs have run as scheduled
  • check individual jobs for errors
  • trace data as it flows through the integration process
  • notify administrators of errors on an exception basis

Inaport has standard logging functions for everyday job management; however, it also has more powerful logging functions for complex and challenging situations. This post is broken into two parts to deal with each type separately:

Part 1 describes the standard log capabilities of Inaport and how they can be used as part of the daily job management process. It shows how to customize the naming of log files, and how to include the user’s own content in the log.

Part 2 describes how configuration of the logging system can be modified to handle complex environments. It shows how to extend the logging system with the user’s own log files; how to log to databases; and how to log to multiple targets at the same time.

All functionality described can be found in Inaport Professional and Enterprise Editions, and is available to new and existing users (with current Software Assurance) by downloading the latest version of Inaport Software from our website.

Part 1 – Standard Inaport logging capabilities

Setting up a log file for an Inaport job

Setting up a log file for a particular job is very easy, and it is strongly recommended that it be done for every profile. By default the log file will contain:

  • Management information such as the profile run, start and stop time.
  • Overall statistics for the run – how many records processed, how many match or match fail, how many written.
  • Detailed statistics for each entity or table in the target – how many rows inserted/updated/deleted, how many errors.
  • Optionally, if source records have data errors or match problems, those rows can be added to the log file.

To set up a file, simply enable logging on tab 6, and specify the location of the file:

Inaport Logging Setup

As well as specifying the log file, you can optionally:

  • Write Start and Complete to System Application Log
  • Send an email when the job is complete. The email will include the overall job statistics by default, and you can optionally attach the full log file.

Naming the log file using expressions

The log file path can be just a fixed file name; for example:

C:\Inaport logs\MyLogFile.txt

However, the log file name can actually be a full Inaport expression. In the screen shot above, the log file name is:

C:\Inaport_logs\#P #D_#T.log

This is the recommended form for the name. The components “#P”, “#D”, and “#T” are replaced, respectively, with:

  • #P – the full name of the profile
  • #D – the current date, in the format ‘yyyyMMdd’. For example: 20151004 for October 4, 2015
  • #T – the current time, in the format ‘THHmmss’. For example: ‘T143021’ for 2:30:21 PM

The net result is that every log file has the full name of the profile and the date/time the job ran. This makes it straightforward to scan a log directory for a particular job run.
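You can also add fixed text of your own around these tokens. For example, to make all the nightly jobs sort together in the directory listing, you could add a prefix – a sketch using only the tokens described above:

C:\Inaport_logs\Nightly_#P_#D_#T.log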

You are not, however, restricted to this format. The name text can be built up using Inaport expressions, such as nowstr(), which can be used to format a date/time however you wish:

Log Path Expression

Because expressions can be included in the file name, Inaport users can ensure that log files are automatically named in a way that makes identification and sorting easy.

Customizing the content of Inaport log files

By default, the standard log file will have:

  • Overall statistics for the run – number of records processed, how many resulted in inserts or updates on the target, start/stop time and overall time
  • Optionally, records that had data errors or match fail errors
  • Detailed statistics for each table targeted – how many matches or match fails, how many inserts/updates/deletes.

For many purposes, the standard information is sufficient. However, there are some scenarios where it may be important to add more information to the log.

For example, suppose you are importing leads from a marketing automation system. As you import the leads, it may be important to know specifically which leads are new to the system, and which already existed and simply had their data updated. In this case, it can be useful to modify the standard Inaport log files.

Inaport provides the log() function, which allows you to write your own information to the log file. For example:

log("Write this to log file")

will cause the text to be written to the log file, in the area where information about record errors is written.

Although this feature is of limited value used in isolation, it can be combined with Inaport’s ability to execute functions after a record is updated, with access to information about the update:

LogUpdateAfterTableOp

Here you can see the PostOperation tab being used to specify a couple of expressions.

  1. The first row says: if the record was inserted, write “Lead: xxx@yyy.com was inserted” to the log file.
  2. The second row says: if the record was updated, write “Lead: xxx@yyy.com was updated” to the log file.

This is how the results look in the log file for some sample data:

Log Contents

Using this approach, you can see the progress of leads through multiple jobs. Part 2 looks at this kind of lead tracking in more detail, using Inaport’s more powerful logging functions.

Part 2 – Advanced Inaport logging

The standard capabilities available in the Inaport logging system are more than adequate to meet many requirements. However, with more complex integration environments, additional capabilities may be required to assist in management of job runs.

This section describes how to achieve the following:

  • Configure the logging system for different types of logging targets – files and database
  • Scenario 1 – Unified logging – having multiple jobs writing to a single log file, and why
  • Scenario 2 – Database logging – having log messages sent to a database

Configuring the logging system for different types of logging targets

Inaport uses an XML file to configure the logging system – LoggingConfig.xml. By default it is installed in the Inaport program directory. The configuration process consists of modifying sections of this configuration file, using a text editor.

The file is set up in two parts – one to set up and configure Appenders (targets), and the second to configure Loggers, which bundle together one or more targets (a skeleton of the file is sketched after the list below):

  • Appenders (logging targets) – specify the details of where a log message can be written. There are several different types of Appender available; we will show how to set up a file, a rolling file, and a database table. Other possible types include email and the System Application Log.
  • Loggers – A package of targets that Inaport can write to in a single operation.
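
The exact wrapper elements in LoggingConfig.xml may differ slightly, but a log4net configuration generally has this shape, with appenders declared first and loggers referencing them by name (element contents elided here):

<log4net>
  <!-- Appenders: where and how log messages are written -->
  <appender name="..." type="...">
    ...
  </appender>
  <!-- Loggers: named bundles of appenders, written to with the logger() function -->
  <logger name="...">
    <appender-ref ref="..." />
  </logger>
</log4net>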

Once you have the Appenders and Loggers configured, you can write to any logger using this Inaport function:

logger("name of logger", "your log message")

Configuring an Appender (Target)

Inaport’s logging system is based on log4net, which provides several different Appender types. See the Apache web site (https://logging.apache.org/log4net/release/config-examples.html) for full documentation on configuring Appenders. This section shows how to configure a FileAppender, a RollingFileAppender, and an AdoNetAppender (database).

FileAppender

Specifies a file to be written to, and the format of the log message. The XML configuration will look like this:

<appender name="MyFileAppender" type="log4net.Appender.FileAppender">
  <file value="c:\inaplex\logs\iplog.log"/>
  <!-- if you need to share this file with another process, 
        uncomment the next line -->
   <!-- <lockingModel type="log4net.Appender.FileAppender+MinimalLock" /> --> 
   <encoding value="UTF-8"/>
   <appendToFile value="true"/>
   <layout type="log4net.Layout.PatternLayout">
       <conversionPattern value="%date{ISO8601} [%thread] %logger{2}: 
          %message%newline"/>
   </layout>
</appender>

Configuration elements include:

  • file – allows you to specify the full path to the log file.
  • appendToFile – if set true will ensure that all log messages are added to the end of the existing file; if set false, a new file will be created for each job run.
  • layout – specifies how the log message will be formatted. In the example above, the portion “%message” specifies the content of the message you send to the logging system; the other parameters add the current date, the running thread, and the name of the logger that was used to write the message (see the example line after this list). Full documentation on the available layout parameters is in the Apache log4net documentation.
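
With the pattern above, a message written through the logging system comes out looking something like the following line (the date, thread, and logger name are purely illustrative):

2015-10-04 14:30:21,123 [1] Inaport.UserLogger: Lead: xxx@yyy.com was inserted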

As a convenience, the LoggingConfig file has a FileAppender called “UserLoggerFile” already configured for you. This is used by the “UserLogger” logger discussed below. You can simply edit the Appender to specify where you want the log file.

RollingFileAppender

Logging output can become quite large. An improvement on the FileAppender is the RollingFileAppender, which allows you to specify that when a log file reaches a certain size it is archived and a new log file is started. You can also specify the maximum number of archive files to keep.

The XML configuration will look something like this:

<appender name="MyRollingFileAppender" type="log4net.Appender.RollingFileAppender">
     <file value="c:\inaplex\logs\rollinglog.log"/>     
     <appendToFile value="true"/>
      <encoding value="UTF-8"/>
     <maximumFileSize value="1000KB"/> <!-- How large the log file will get -->
     <maxSizeRollBackups value="5"/> <!-- How many older log files to keep -->
     <layout type="log4net.Layout.PatternLayout">
       <conversionPattern value="%date{ISO8601} [%thread] %logger{2}: 
           %message%newline"/>
     </layout>
</appender>

Some of the configuration elements are the same as the FileAppender. The new ones are:

  • maximumFileSize – specify how large the log file is allowed to get
  • maxSizeRollBackups – how many archived log files to keep before the oldest is deleted

AdoNetAppender

The AdoNetAppender allows you to target most types of database – SQL Server, Oracle, Access, or others. Logging messages are written to a specified table in the database. As with the other Appenders, there are a range of parameters that can be written in addition to the base log message.

Before log information can be written, a table in the target database must be created. Here is sample SQL to create a basic log table:

CREATE TABLE [dbo].[Log] ( 
  [ID] [int] IDENTITY (1, 1) NOT NULL ,
  [Date] [datetime] NOT NULL ,
  [Logger] [varchar] (255) NOT NULL ,
  [Message] [varchar] (4000) NOT NULL 
) ON [PRIMARY]

And here is the XML configuration of an AdoNetAppender that would log to this table:

<appender name="MySQLAppender" type="log4net.Appender.AdoNetAppender">
  <connectionType value="System.Data.SqlClient.SqlConnection, 
        System.Data, Version=1.0.3300.0, Culture=neutral, 
        PublicKeyToken=b77a5c561934e089" />
    <connectionString value="data source=SQLSVR;initial catalog=test_log4net;
        integrated security=false; persist security info=True;
        User ID=sa;Password=sa" />
  <commandText value="INSERT INTO Log ([Date],[Logger],[Message]) 
        VALUES (@log_date, @logger, @message)" />
  <parameter>
    <parameterName value="@log_date" />
    <dbType value="DateTime" />
    <layout type="log4net.Layout.PatternLayout" 
        value="%date{yyyy'-'MM'-'dd HH':'mm':'ss'.'fff}" />
  </parameter>
  <parameter>
    <parameterName value="@logger" />
    <dbType value="String" />
    <size value="255" />
    <layout type="log4net.Layout.PatternLayout" value="%logger" />
  </parameter>
  <parameter>
    <parameterName value="@message" />
    <dbType value="String" />
    <size value="4000" />
    <layout type="log4net.Layout.PatternLayout" value="%message" />
  </parameter>
</appender>

For obvious reasons, the elements of the configuration are different from the file-based setup:

  • connectionType – specify the type of database – in this case, SQL Server
  • connectionString – specify the connection string to be used to connect. For more information about connection strings for different databases, see: ConnectionStrings.com.
  • commandText – specify a SQL insert statement that is used to write to the log table. It is also possible to use a stored procedure instead (see the sketch after this list).
  • parameter – the SQL insert statement is parameterized. Each of the parameter values is specified using a parameter configuration element.
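
If you would rather use a stored procedure, log4net supports this directly: the commandText names the procedure and a commandType element is added. A minimal sketch, assuming a procedure called LogInsertProc whose parameters match those defined above (the procedure name is illustrative):

<commandText value="LogInsertProc" />
<commandType value="StoredProcedure" />

The <parameter> elements are configured exactly as before and map onto the procedure’s parameters.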

Configuring a Logger

The Appenders specify the nuts and bolts – the possible targets that log messages can be written to.

The Logger specifies where a log message will actually be written. It also defines a name that you can reference with the logger() function in Inaport.

The Inaport LoggingConfig.xml file already has some loggers defined, which (with the exception of UserLogger) you should not modify:

  • user – used by Inaport to create the standard log file discussed in Part 1
  • sysevent – used by Inaport to write to the System Application log if you request it on the logging setup tab
  • debug – used by Inaport when you set up debug logging using the Tools – Start Debug Logging option
  • UserLogger – this is the default logger for the logger() function, and uses the UserLoggerFile appender noted above. You are free to modify this logger.

You can add your own loggers to the configuration. For example, suppose you wanted to be able to log to both a rolling log file and a database. You might configure a logger like this:

<logger name="MyLogger">
    <level value="INFO"/>
    <appender-ref ref="MyRollingFileAppender" />
    <appender-ref ref="MySQLAppender" />
</logger>

You would then be able to write to this logger using the logger function:

logger("MyLogger", "some important message to log")

Because you can use this function anywhere in Inaport, you can decide exactly what information is to be logged. If you set up more than one logger, you can write the same or different information to each logger.
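
For example, using the predefined UserLogger alongside the MyLogger definition above, the same event can be recorded in two places:

logger("MyLogger", "Lead batch completed")
logger("UserLogger", "Lead batch completed")

The first call goes to both the rolling file and the database table; the second goes to the file configured for the UserLoggerFile appender.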

Scenario 1 – using a unified log file for data tracing

The individual log files produced by an Inaport profile are very useful for examining a single run in detail, but as the number of scheduled jobs grows, so does the number of files produced each day. This can make it difficult to see “the wood for the trees”.

An Inaport customer was faced with this situation when pulling leads from the Eloqua marketing automation system into a CRM system for multiple business units. Each business unit had its own criteria for lead handling, and the jobs ran hourly, resulting in over 50 log files each day. When leads are not flowing as expected, it is obviously critical for the business to identify the issue as quickly as possible, whether it is a problem with the integration or with lead routing in Eloqua.

The solution adopted was to add a RollingFileAppender and a Logger to the configuration, and have all profiles write to the unified logger whenever a lead is inserted or updated in CRM. The logger() call sits in the PostOperation tab of each profile.

UnifiedLogging

The result is that each day, in addition to the detailed log files, there is a single file that shows in a straightforward way what has happened to every lead processed through the integration system.
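
The configuration behind this is simply a combination of the pieces shown earlier – a RollingFileAppender for the unified file, plus a logger that references it (the names below are illustrative):

<logger name="UnifiedLeadLogger">
    <level value="INFO"/>
    <appender-ref ref="MyRollingFileAppender" />
</logger>

Each profile’s PostOperation expressions then write to it with a call such as logger("UnifiedLeadLogger", "Lead: xxx@yyy.com was inserted").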

Scenario 2 – log job runs to a database

In this example, the customer had a requirement to log all runs of their jobs to a database, to ensure that the scheduling system was working correctly. As in scenario 1, they wanted to continue using the individual log files as well.

The solution was to add an AdoNetAppender and a DatabaseLogger to LoggingConfig.xml.

Then, in the Pre/Post group on Tab 1 of the profile, logger() expressions were added to indicate when the job started and when it completed:

DatabaseLogger
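
The expressions themselves can be as simple as the following, assuming the logger added for this scenario is named “DatabaseLogger” (the message text is entirely up to you):

logger("DatabaseLogger", "Job started")
logger("DatabaseLogger", "Job completed")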

The result is that there is an entry in the database table for every job, indicating when it started and when it completed.

Conclusion

The Inaport logging system is designed to be very simple to use, and for most requirements “it just works” as part of the Inaport job run process.

Integration projects can become very complex, however, and the logging system can be tailored to meet those complexities, making the overall process much simpler to manage and less error prone.

If you have any questions about configuring logging to meet your requirements, please do not hesitate to complete a Contact Request form to book a free, no obligation webinar or request a call from one of our consultants. You may also download an Inaport Evaluation license at no charge from our website.

Inaport, from InaPlex, delivers CRM integration and data migration solutions for leading CRM systems including Salesforce, Microsoft Dynamics CRM, Sage CRM, Infor CRM (previously Saleslogix), Goldmine and ACT. It has a range of connectors and maps for quick results, and its straightforward approach provides a wealth of powerful functions. To learn more please book a free webinar, download a free 30 day evaluation license, or visit the InaPlex website.
