Month: October 2017

Common Implementations – GL Imports

Here is the code snippet:

public void Import(IEnumerable<DataLine> lines, DateTime postingDate)
{
	// TODO: To increase efficiency, comment out any unused DB links.
	using (HuAPConnection Connection = new HuAPConnection())
	using (HuView GLBCTL = Connection.GetView("GL0008"))
	using (HuView GLJEH = Connection.GetView("GL0006"))
	using (HuView GLJED = Connection.GetView("GL0010"))
	using (HuView GLJEDO = Connection.GetView("GL0402"))
	{
		GLBCTL["PROCESSCMD"] = "1";  // Lock Batch Switch

		GLJEH["BTCHENTRY"] = "";  // Entry Number
		GLJEH["BTCHENTRY"] = "00000";  // Entry Number
		GLJEH["SRCETYPE"] = "AP";  // Source Type
		GLJEH["DATEENTRY"] = postingDate;

		foreach (DataLine line in lines)
		{
			GLJED["ACCTID"] = line.GLAccount;  // Account Number
			GLJED["PROCESSCMD"] = "0";  // Process switches
			GLJED["SCURNAMT"] = line.Amount;  // Source Currency Amount
		}
	}
}



GLBCTL is the journal entry batch view

GLJEH is the header for the journal entry

GLJED is the line items for the journal

Dynamic Fields in Views (1.0.5 feature)

HuLib has made Sage interactions much nicer by using an indexer to get fields from views rather than awkward functions, but there are still some annoyances. It would be nice if we didn't always have to cast from an object, even when we are assigning it to a variable of the same type.

string vendor = apibh["IDVEND"].ToString();
DateTime date = (DateTime)apibh["DATEINVC"];
decimal amount = (decimal)apibh["AMTUNDISTR"];

apibh["AMTUNDISTR"] = amount;

Now HuLib has an alternative (though the above code is still perfectly valid). There is now a dynamic object called Fields which allows you to get fields out without bothering with the type casting:

string vendor = apibh.Fields.IDVEND;
DateTime date = apibh.Fields.DATEINVC;
decimal amount = apibh.Fields.AMTUNDISTR;

apibh.Fields.AMTUNDISTR = amount; // Assignment works too

The above code leverages the dynamic type functionality that was added to C# some years ago. There is no IntelliSense for this type and no compiler errors or warnings for whatever member you access:

apibh.Fields.THISDOESNTEXIST; // this does not cause a compiler error

Note that this means the above code does not generate compiler errors, but it can cause runtime errors. In addition, if you try to assign a string to an int, that will likewise cause a runtime error.
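For readers unfamiliar with how a dynamic accessor like Fields can work, here is a rough sketch of the mechanism in Python (HuLib itself is .NET; the class and field values below are invented for illustration). Attribute names are resolved at runtime against an untyped store, which is exactly why a typo gets through the compiler but fails when executed:

```python
class Fields:
    """Resolve attribute access at runtime against a plain dict of
    field values, mimicking a dynamic accessor: no compile-time checks."""

    def __init__(self, values):
        object.__setattr__(self, "_values", values)  # bypass our own __setattr__

    def __getattr__(self, name):
        try:
            return self._values[name]
        except KeyError:
            # Fails only when executed, like a bad dynamic member in C#
            raise AttributeError("No field named " + name)

    def __setattr__(self, name, value):
        self._values[name] = value  # assignment works too


fields = Fields({"IDVEND": "1200", "AMTUNDISTR": 49.5})
vendor = fields.IDVEND      # no cast needed; the stored type comes back
fields.AMTUNDISTR = 50.0    # assignment goes straight into the store
```

Accessing a field that does not exist raises an error only at that moment, the same failure mode the dynamic Fields object has.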

This is just a convenience extension to views for when you don't feel like using indexers and casting. The implementation will function exactly the same. (Note that you cannot use put without verification with Fields.)

Common Implementation – Optional Field Mapping

It is quite common that we use optional fields in Sage to store key-value pairs. The way this works in Sage is that the Value column of an optional field is considered the key and the description is considered the value.

Optional Fields

To make interacting with this even easier with HuLib, reference HuLib.AccPac.Client; it includes an overload on Connection that can easily pull all this data into a dictionary that is ready to use:

Dictionary<string, string> custVendorLookup = Connection.GetOptFieldMap("CUSTTOSTORE");
string vendor = custVendorLookup["1100"]; // 0083

Feel free to use this to look up values from Sage without creating views manually!
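The resulting data shape is easy to picture. In Python terms (with made-up rows, not real Sage data), each optional-field row contributes its Value column as the dictionary key and its description as the dictionary value:

```python
# Hypothetical optional-field rows as (value, description) pairs,
# e.g. the rows of one optional field code such as CUSTTOSTORE.
rows = [
    ("1100", "0083"),
    ("1200", "0090"),
]

# The Value column becomes the key, the description becomes the value.
cust_vendor_lookup = {value: description for value, description in rows}

vendor = cust_vendor_lookup["1100"]  # matches the HuLib example above
```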

Reading CSV files

With HuLib, reading CSV files is even easier than before. There is a class in HuLib called CSVFile; you can instantiate it with the name of the file you want to read. From there you can call Load to populate the structure and read values out of it, OR use the mapping framework:

public IEnumerable<MightyLine> GetLines(string fileName)
{
	CSVFile file = new CSVFile(fileName);
	List<MightyLine> mightyLines = file.CreateClasses<MightyLine>();

	return mightyLines;
}

This code creates a list of MightyLine objects (a class decorated with Mapping attributes):

public class MightyLine
{
	[Mapping(0)]  // column indices here are illustrative; use your file's actual columns
	public int Cust { get; set; }

	[Mapping(1)]
	public string CustomerName { get; set; }

	[Mapping(2)]
	public string Range { get; set; }

	[Mapping(3)]
	public string PartNumber { get; set; }
}

The number passed to each Mapping attribute represents the column in the CSV file; that is how the data will be pulled.
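Under the hood, mapping frameworks like this generally just pair each property with a column index and a conversion. Here is a hypothetical Python sketch of that idea (the column numbers, converters, and sample rows are made up, not the real MightyLine mapping):

```python
import csv
import io

# Hypothetical column mapping: property name -> (column index, converter),
# playing the role of the Mapping attributes on the model class.
MAPPING = {
    "Cust": (0, int),
    "CustomerName": (1, str),
    "Range": (2, str),
    "PartNumber": (3, str),
}


def create_classes(csv_text):
    """Build one record per CSV row by pulling and converting each mapped column."""
    rows = csv.reader(io.StringIO(csv_text))
    return [
        {name: convert(row[col]) for name, (col, convert) in MAPPING.items()}
        for row in rows
    ]


lines = create_classes("12,Acme,North,PN-1\n34,Zenith,South,PN-2")
```

(Records are plain dicts here for brevity; CreateClasses produces typed objects.)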

To make things even easier, I have created an internal tool that will figure this out for you! http://devsrv/ViewInfo2/CSV

CSV Tool

Simply enter the name of the class you want to be generated, select an example CSV file you will be loading from and hit Generate.

On the left, you have the model that will store the rows. On the right, you have code for a function that reads in a file to generate these objects. It even tries to figure out the type that you would want for the data (though be sure to verify it chose the correct one).


Happy Coding!

Importing Receipts into Sage Using HuLib

Here is another code dump, this time for importing Receipts into AccPac.

public void ImportReceipts(IEnumerable<ReceiptLine> receipts, string bankCode, DateTime receiptDate, string batchDescription)
{
	using (HuView ARBTA = Connection.GetView("AR0041"))
	using (HuView ARTCR = Connection.GetView("AR0042"))
	using (HuView ARTCP = Connection.GetView("AR0044"))
	using (HuView ARTCU = Connection.GetView("AR0045"))
	using (HuView ARTCN = Connection.GetView("AR0043"))
	using (HuView ARPOOP = Connection.GetView("AR0061"))
	using (HuView ARTCRO = Connection.GetView("AR0406"))
	using (HuView ARTCC = Connection.GetView("AR0170"))
	using (HuView ARPYPT = Connection.GetView("AR0049"))
	{
		ARBTA["CODEPYMTYP"] = "CA";  // Payment Type
		ARTCR["CODEPYMTYP"] = "CA";  // Payment Type
		ARTCN["CODEPAYM"] = "CA";  // Payment Type
		ARTCP["CODEPAYM"] = "CA";  // Payment Type
		ARTCU["CODEPAYM"] = "CA";  // Payment Type
		ARPOOP["PAYMTYPE"] = "CA";  // Payment Type
		ARBTA["CODEPYMTYP"] = "CA";  // Payment Type
		ARBTA["CNTBTCH"] = "0";  // Batch Number
		ARBTA["BATCHDESC"] = batchDescription;
		ARBTA["IDBANK"] = bankCode;
		ARBTA["PROCESSCMD"] = "2";  // Process Command

		foreach (IGrouping<string, ReceiptLine> receiptLine in receipts.GroupBy(line => line.FleetID))
		{
			ARTCR["IDCUST"] = receiptLine.Key;  // Customer Number
			ARTCR["PROCESSCMD"] = "0";  // Process Command Code

			ARTCR["DATERMIT"] = receiptDate;

			foreach (ReceiptLine line in receiptLine)
			{
				ARTCP["IDINVC"] = line.Invoice + "-" + line.Store.PadLeft(4, '0');  // Document Number
				ARTCP["AMTPAYM"] = line.EntityCredit * -1;
				ARTCP["AMTERNDISC"] = line.Discount + line.ProcessingFee;
			}

			ARTCP["CNTLINE"] = "-1";  // Line Number
			decimal unapplied = (decimal)ARTCR["REMUNAPL"];
			ARTCR["AMTRMIT"] = unapplied * -1;  // Bank Receipt Amount
		}
	}
}



ARBTA is the receipt batch; we are creating one of these.

ARTCR is the Receipt view; we create one per customer, so multiple detail lines end up on a single receipt when they share the same customer.

ARTCP is the Receipt Detail view; there is one per data line here.

After the line items are added we set AMTRMIT to the unapplied amount multiplied by negative one. This sets the receipt amount to match what we have set in the detail lines.

Importing AR Invoices into AccPac using HuLib

Dumping some sample HuLib code for creating AR Invoices in Sage. This is copied from one of our projects so it will have to be modified for whatever project you want to use the code for.

public void ImportInvoices(IEnumerable<InvoiceLine> lines)
{
	using (HuView ARIBC = Connection.GetView("AR0031"))
	using (HuView ARIBH = Connection.GetView("AR0032"))
	using (HuView ARIBD = Connection.GetView("AR0033"))
	using (HuView ARIBS = Connection.GetView("AR0034"))
	using (HuView ARIBHO = Connection.GetView("AR0402"))
	using (HuView ARIBDO = Connection.GetView("AR0401"))
	{
		ARIBC.Browse("((BTCHSTTS = 1) OR (BTCHSTTS = 7))");
		using (HuView ARIVPT = Connection.GetView("AR0048"))
		{
			ARIBC["PROCESSCMD"] = "1";  // Process Command

			foreach (IGrouping<string, InvoiceLine> invoiceLine in lines.GroupBy(line => line.Invoice))
			{
				ARIBH["IDCUST"] = invoiceLine.First().BID;  // Customer Number
				ARIBH["PROCESSCMD"] = "4";  // Process Command
				ARIBH["INVCTYPE"] = "2";  // Invoice Type
				ARIBH["IDINVC"] = invoiceLine.First().Invoice + "-" + invoiceLine.First().Store.PadLeft(4, '0');  // Document Number
				ARIBH["DATEINVC"] = invoiceLine.First().InvDate;

				ARIBD["PROCESSCMD"] = "0";  // Process Command Code

				foreach (InvoiceLine line in invoiceLine)
				{
					ARIBD["PROCESSCMD"] = "0";  // Process Command Code
					ARIBD["IDDIST"] = GetDistributionCode();  // Distribution Code
					ARIBD["AMTEXTN"] = line.grosssales - line.promototal;  // Extended Amount w/ TIP
				}
			}
		}
	}
}


Say we are given a collection of invoice detail lines as input.

ARIBC is the batch view, we want to create a new batch to put all the invoices in.

ARIBH is the invoice view, we want to group all of the detail lines by the invoice number and create a new invoice for each.

ARIBD is the invoice detail line view, we want a new detail line for each line in the data that we are importing.

Reading Excel Files

There are multiple methods in HuLib for reading Excel files. The latest addition is the BufferedExcelReader, which is meant to replace the older methods as it is much more performant. It achieves this by grabbing all of the Excel data in bulk at the beginning and then providing methods to read data directly from the buffered copy.

Creating an instance

To create an instance, just call the constructor with the filename:
BufferedExcelReader reader = new BufferedExcelReader(fileName);
Now you are ready to read from it! Note that initially the selected sheet will be the active workbook sheet; if you want to read a different sheet, pick it out of the reader's Sheets collection by name (as shown in the mapping example later in this post).

Reading data directly

There are two main ways to do the reading.

1. Using the reader

This is convenient when column numbers can change and you want to read data row by row and not have to keep track of the position manually:
string[] readLine = reader.ReadStrings();
object[] readData = reader.ReadLine();
ReadStrings() will convert everything to a string, while ReadLine() will give you the raw types. Each call advances the reader to the next line, so you just have to deal with the data that comes out of it.
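To make the cursor behaviour concrete, here is a minimal Python sketch of a reader like this (not HuLib's implementation; the names, rows, and types are illustrative):

```python
class RowReader:
    """Sketch of a cursor-style reader: each read returns one row
    and advances to the next, so callers never track position."""

    def __init__(self, rows):
        self.rows = rows
        self.position = 0

    def read_line(self):
        """Return the current row with its raw, mixed types."""
        row = self.rows[self.position]
        self.position += 1
        return row

    def read_strings(self):
        """Same as read_line, but every value stringified."""
        return [str(value) for value in self.read_line()]


reader = RowReader([[1, "a", 2.5], [2, "b", 3.5]])
first = reader.read_strings()   # everything converted to strings
second = reader.read_line()     # cursor has advanced to the second row
```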

2. Use the sheet directly

If you want to use direct locations then you can use the sheet directly:
BufferedExcelReader.Sheet sheet = reader.SelectedSheet;
object read = sheet.Read("B8");
read = sheet.Read("B", 8);
read = sheet.Read(8, 2);
These 3 calls return the same thing; note that the third uses the more common argument order where the row comes first (it looks reversed relative to the first 2 methods).
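The three overloads differ only in how the cell address is spelled. A small Python sketch of the underlying address arithmetic (not HuLib code) shows why "B8", ("B", 8) and (8, 2) identify the same cell:

```python
def a1_to_row_col(address):
    """Convert an A1-style address like 'B8' to a 1-based (row, column) pair."""
    letters = "".join(ch for ch in address if ch.isalpha()).upper()
    row = int("".join(ch for ch in address if ch.isdigit()))

    # Column letters are a base-26 number where A=1 ... Z=26.
    col = 0
    for ch in letters:
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return row, col
```

So "B8" works out to row 8, column 2, which is the same cell the (row, column) overload addresses directly.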

Mapping with BufferedExcelReader

The quickest way to read tables off of Excel is to take advantage of the Mapping framework.
To use the mapping framework we need 2 things: A model and an object that implements the mappable interface. In this case a Sheet is a mappable object so we can use it to generate classes for us.

1. The Model

The model is just a class with properties; the only difference is that we must define the Mapping attribute on each property to identify the column that will contain the data.
public class ReceiptLine
{
	[Mapping("A")]  // column letters here are illustrative; use your sheet's actual columns
	public string Store { get; set; }

	[Mapping("B")]
	public string Invoice { get; set; }

	[Mapping("C")]
	public string Date { get; set; }
}
You can see that the argument for the Mapping attribute is the column that the data will be pulled from.

2. The Mappable Object

Now we want to populate a collection of ReceiptLines from an excel file, this is trivial:
BufferedExcelReader reader = new BufferedExcelReader(fileName);
BufferedExcelMap bufferedExcelMap = reader.SelectedSheet.GetMap(9);
List<ReceiptLine> receiptLines = bufferedExcelMap.CreateClasses<ReceiptLine>();
First we create the reader, then we take the selected sheet and call GetMap – this returns an object that we use for mapping. The argument 9 is the row where the data starts (not the header); in this example A9 will be the first cell of data. Using this BufferedExcelMap object we just call CreateClasses() to create all the rows of data, as we would from any other mappable object.
Another example:
public IEnumerable<ReceiptLine> GetReceiptLines(string fileName)
{
	BufferedExcelReader reader = new BufferedExcelReader(fileName);
	BufferedExcelReader.Sheet sheet = reader.Sheets.First(s => s.Name == "Invoice Summary");
	BufferedExcelMap bufferedExcelMap = sheet.GetMap(7);
	List<ReceiptLine> fleetLines = bufferedExcelMap.CreateClasses<ReceiptLine>();

	return fleetLines;
}



Comparing this to the older Excel method, which did not preload data, on a file with only about 200 rows:
Old method: 49 seconds
New method: 2 seconds – note that the time is now spent in the constructor; the mapping part is very quick
The difference is entirely due to reducing the number of interop calls into excel by fetching all data at the start.
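The source of the speedup is easy to demonstrate in miniature. This Python sketch (a stand-in for the interop layer, not real Excel code) counts expensive boundary crossings for a cell-by-cell reader versus one that buffers everything up front, which is the approach BufferedExcelReader takes:

```python
class FakeWorkbook:
    """Stands in for an Excel interop object where every call is an
    expensive cross-process round trip."""

    def __init__(self, data):
        self.data = data
        self.calls = 0  # each method call models one interop round trip

    def read_cell(self, row, col):
        self.calls += 1
        return self.data[row][col]

    def read_all(self):
        self.calls += 1
        return [row[:] for row in self.data]


data = [[r * 10 + c for c in range(5)] for r in range(200)]  # 200 rows x 5 cols

per_cell = FakeWorkbook(data)
slow = [[per_cell.read_cell(r, c) for c in range(5)] for r in range(200)]

buffered = FakeWorkbook(data)
fast = buffered.read_all()  # one bulk call up front, then everything is local
```

The cell-by-cell reader makes 1000 round trips for the same data the buffered reader fetches in one.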

Project Development Walkthrough


This is merely to give an idea of how to go about thinking through projects. Many steps are skipped or ignored for the sake of simplicity (e.g. many minor details in the ticket about what is being imported aren't important for the general idea).

1. Take a look at the ticket

2. Clone the repo

Find the git URL (you can find it through mapper if needed). Now clone the repo. The way I approach this is to have a network directory with all of my git repos so that I can access them from any VM. For naming, I begin with the project number and some relevant name (658 CB Squared AR Import); this makes it easy to find in the directory.

3. Create the project

Create a new project using the latest Hutility AccPac Template (in this case it is Hutility AccPac Template 2017 10 11). Set the location to be within your cloned repo in a Source folder. Initially it may look like there are errors, build the project so that NuGet packages are downloaded.

4. Plan out the project structure

You can jot down ideas or go straight to putting together some of the components below if it is straightforward enough. The idea is to separate concerns so that each piece has an obvious single function. The more convoluted a class is, the more difficult it is to test, update, or even read. Horizontal bloat is much better than vertical bloat (make more short classes instead of fewer large classes).

4.1 Models

Think about the requirements and identify the models you will need. Rather than pulling information and using it right away, it is often preferable to populate the data, store it in a class (a model), and then process it in another location. This way the processing/import functionality is separated from the populating functionality, and the two do not intertwine directly.
The reason this is preferable to combining the population and import functionality is that it is much easier to understand, test, and debug. For example, if we have a Pull Invoices function that creates a list of InvoiceLine objects and an Import Invoices function that accepts them, we can easily inspect the data after the Pull Invoices call to confirm it is correct. Likewise, inside the Import Invoices command we can see the populated InvoiceLine objects before the import and what happens after. Without the intermediate step of storing the data in model classes, you would have a lot more difficulty tracing issues.
Back to our example. At first glance there is one obvious model we need: the query returns a list of invoice detail lines, and we can store each line as its own model. I created a folder called Models and added a public class called InvoiceLine, the properties of which are the different columns that come out of the query.

4.2 Services

Now that we have the models, think about all the interactions we will need, e.g. importing, exporting, converting. Depending on what these actions are doing, you may create one or more services. A service is a collection of methods within a certain domain. The template starts with SageService; this is built into the template since it is normally used with Sage interactions in mind, and some other portions of the template need to know what connection we are using.
For this project it looks like we will also be querying another database/table. It would make sense for a service separate from Sage to handle this. To deal with these operations I added a new class to the Services folder called EntityService. I will use this to perform the query and return a list of InvoiceLines. For importing into Sage it is easy enough to add a new function to SageService that will take InvoiceLines and perform the import.

4.3 User Interface

For every project there must be at least some user interface component, even if it is as simple as a button. For more complicated projects you may have to create more views, in cases like this you may want to create more view model classes if it makes sense in the hierarchy. This can be a complicated topic so I will leave complicated implementations for another post.

4.4 How they fit together

So now we have a service that can create InvoiceLines and a service that can import them into Sage but we need to combine the two for the program to be useful. Usually this will be the ViewModel (the primary one or new ones you create) that ties them together.

5. Implementation

5.1 InvoiceLine

This class is very simple: it is just one property per column from the query. We do add one thing that will come into use later, however: the Mappable attribute on the class (it comes from HuLib).

5.2 EntityService

The first thing to note is that we will need SQL connection information for the import, so we can add it as constructor arguments (it doesn't make sense to do anything in this service without a connection). Following that, we will need the function to get the invoice data.
public EntityService(string server, string database, string user, string pass)
public IEnumerable<InvoiceLine> GetInvoiceData(DateTime dateFrom, DateTime dateTo)
Since a single connection would likely be used for the lifespan of the EntityService, the constructor just creates the connection using the parameters. To make it simpler another HuLib function is used:
_connection = SQLUtility.CreateConnection(_server, _user, _pass, _database);
To populate the invoice data we simply populate a list of invoice lines using another HuLib function:
IEnumerable<InvoiceLine> invoiceLines = _connection.FillClasses<InvoiceLine>(query);
What this function does: for each property in the InvoiceLine class, it finds a column with the same name and pulls its value. It does this for each record returned by the query to give you a list of populated objects. The Mappable attribute from 5.1 indicates that all properties can be mapped this way (you can also use the Mapping attribute on individual properties instead).
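A small Python sketch of that match-by-name idea (the class and query columns here are invented for illustration; HuLib's actual implementation may differ):

```python
class InvoiceLine:
    """One attribute per query column; names must match the column names."""

    def __init__(self):
        self.Invoice = None
        self.BID = None
        self.Amount = None


def fill_classes(records, cls):
    """For each record, set every attribute of a new cls instance whose
    name matches one of the record's columns; unmatched columns are skipped."""
    results = []
    for record in records:
        obj = cls()
        for column, value in record.items():
            if hasattr(obj, column):
                setattr(obj, column, value)
        results.append(obj)
    return results


rows = [{"Invoice": "INV-1", "BID": "1100", "Amount": 25.0, "IgnoredCol": "x"}]
lines = fill_classes(rows, InvoiceLine)
```

Note how the column with no matching property is simply ignored rather than causing an error.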
Great, now we have an EntityService class that is able to pull the data we need.

5.3 Wiring up EntityService properties (Config and UI)

Now would be a good time to write tests for EntityService if we were going to do so. What I do at this point is wire up the UI components including the Import button to pull data from EntityService but not actually do the import. The reason for this is so I can verify that the pulling data portion works before writing the import logic. That way we can identify that at least part of the application is working and if there are any architecture changes that are needed for this half we can do them without having to rework the import logic.
This is the benefit of separating the data pulling and import logic: we can use each part on its own without affecting the other.
Now note that to use EntityService we needed the connection information. It would be nice if this could persist between executions of our application. This is where the Config class comes in handy. Properties added to this class will persist between executions (they are loaded on startup, saved when the application closes).
4 properties are added to Config: Server, Database, User, Password. The first 3 of these are simply strings. The 4th uses a type called EncryptedString, because we do not want to store the password in plaintext (a security no-no). EncryptedString is serializable, is encrypted in memory, and can be implicitly cast to string (and used in place of a string argument); this makes it very easy to use (in most cases you can treat it as a normal string).
public string Server { get; set; }
public string Database { get; set; }
public string User { get; set; }
public EncryptedString Password { get; set; }
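As a sketch of why such a wrapper type is convenient, here is a loose Python analogue (illustrative only: the base64 step below is mere encoding, not encryption; the real EncryptedString type encrypts the value in memory):

```python
import base64


class EncryptedString:
    """Toy analogue of an encrypted-string wrapper: the raw plaintext is
    never stored directly, but the value still behaves like a string
    wherever one is expected. NOT real security - use proper encryption."""

    def __init__(self, plaintext):
        # Stored in an encoded form rather than as the raw plaintext.
        self._data = base64.b64encode(plaintext.encode())

    def __str__(self):
        # Decoded on demand, so the wrapper drops into string contexts.
        return base64.b64decode(self._data).decode()


password = EncryptedString("hunter2")
connection_string = f"User=sa;Password={password}"  # usable like a string
```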
Now we need to set these within the UI. These settings will likely be changed very infrequently, so storing them on the main window would be overkill. Luckily the newest template has another view we can use – the ConfigurationWindow.xaml window will be where we put these properties.
Looking at the ConfigurationWindow, find the commented-out password box and its associated label; this is an example of adding to the configuration section of this screen. For each property add a label and textbox pair, except for the password, which will use the BindablePasswordBox control.

Now that our configuration window is set up, let's allow it to be shown from the main window. Open MainWindow.xaml and uncomment the menu item line within the Tools menu.
We should now be able to run the program, open Tools>Edit Configuration, change values, close the program, reopen it and the settings will still be there. (Note you must properly close the program, not terminate via Visual Studio)

5.4 Wiring up the button for EntityService

Now for the button. The button needs to be connected to a command; in this case it will likely be a longer-running command. To make it a better experience for the user, it is useful to have the application indicate that it is processing. Luckily, the template makes this much easier as well.
Let's start with the ViewModel.cs class. Create the import function:
public void Import()
{
   EntityService entityService = new EntityService(Config.Server, Config.Database, Config.User, Config.Password);
   IEnumerable<InvoiceLine> invoiceLines = entityService.GetInvoiceData(DateFrom, DateTo);
}
Note that DateFrom and DateTo were added to the view model as public properties.
Declare a new command in ViewModel called ImportCommand.
public ICommand ImportCommand { get; }
The command does not yet do anything, as we will set it up in the constructor. In this case we do not want to use a RelayCommand (which runs synchronously, so a long operation would freeze the UI). We should use an AsyncCommand instead; these require many parameters, so to make it easier we can use an AsyncCommand factory. Within the constructor find the commented line
AsyncCommandFactory commandFactory = new AsyncCommandFactory(StateManager, "Loading....");

Uncomment this and use it to create the new asynchronous command:

AsyncCommandFactory commandFactory = new AsyncCommandFactory(StateManager, "Loading....");
ImportCommand = commandFactory.Create(Import);
That's it for the view model! Now to wire it up to MainWindow.xaml. Within MainWindow.xaml, find the grid that has a comment indicating where to add view code. We want to add a date from and date to as well as a button to perform the import.

5.5 Test the pulling of data

Great, now everything should be wired up (except the Sage portion, of course), so let's give it a run. Run the application, edit the configuration with the correct SQL information, and hit Import. Placing a breakpoint at the end of the Import function lets us see what came out of it (hopefully the invoice detail rows). If an error occurs, note that the program doesn't crash – the async command handles these cases for us and politely shows a message box describing the issue.

5.6 Implementing Sage Import

At this point this becomes so project specific that I won’t bother going into too much detail on this case. But here is the general idea behind implementing Sage interactions:
1. Do it in the Sage UI, or at least something similar to it
2. Replicate what you did in the UI this time running Record Macro before
3. Stop Recording, and copy the macro over to the Macro Converter to get C# code (find latest at \\build1\Artifacts\400 Tools)
4. Copy over the C# to a new function in SageService
In our case we will create a new method:
public void Import(IEnumerable<InvoiceLine> lines)
Now we simply call this from the view model's import function and display a message box afterwards indicating that it was successful!

5.7 Finishing touches

Head over to AppSettings.cs and take a look at the constants there. Change them to what makes sense for your project (you will likely change the Project Repo # and the name).
Open the project's properties and change the Assembly name.
Still inside project properties, open Assembly Information and change the Title and Product.

Hutility AccPac Template 2017 10 11 Changes

There are a number of changes made to our template in order to make it easier to use.

1. AppViewModel base class replaces TemplateViewModel

Rather than separating ViewModel from TemplateViewModel, ViewModel will inherit from AppViewModel. This way properties can be put directly in the base class and the ViewModel class will have less bloat. 

2. MenuViewModel, MenuView are removed

The menu and the bound data are moved to MainWindow and AppViewModel respectively. This makes it much easier to add new menu items – just add the commands in your view model and add the menu item in the MainWindow.xaml.

3. Generated settings file removed, use AppSettings.cs directly

The previous system was somewhat confusing; it was easy to forget to run the custom tool to regenerate the file. The new method for setting application data is to just use the AppSettings class directly.

4. Configuration Window added

A frequent requirement for applications is for some configuration settings to be hidden away. A good place for this is within the menu. While this is now much easier to do due to changes 1 and 2 above, a new basic EditConfiguration window is also available – Views/ConfigurationWindow. The DataContext points to the view model, so you just have to hook it up and uncomment the menu item line in MainWindow.xaml.

5. A few more examples

The Config has an example of a password (using EncryptedString) and the ConfigurationWindow has an example of binding to it (using BindablePasswordBox); use these as examples of how to handle passwords securely.

6. HuLib 1.0.4

The template now uses the latest version of HuLib (as of 10/11/2017).