Monday, February 20, 2017

Converting a DataTable to objects in C#

By Steve Endow

Let's say you have a C# Dynamics GP integration that retrieves Customer records from a staging table and you need to convert each row to a Customer object so that you can more easily work with the data.

You could loop through the rows in your DataTable, read every field in the row, and assign each field value to each corresponding object property.  But that would be tedious, particularly if your object has dozens of properties.  And it's very low-value coding.

After having done this a few times, I got sick of all of the typing and figured there had to be a better way.  Of course there is.  Fortunately a very smart person posted some code that handles this task very well, using Reflection.

https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

It works incredibly well and requires all of 1 or 2 lines of code to use.

I made a few different variations, allowing me to use it with a DataTable, DataRow, and DataRow arrays.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Reflection;
using System.Data;

namespace TSM.Integration.Library
{
    static class ObjectMapper
    {
        /// <summary>
        /// Converts a DataTable to a list of generic objects
        /// </summary>
        /// <typeparam name="T">Generic object type</typeparam>
        /// <param name="table">DataTable to convert</param>
        /// <returns>List of generic objects</returns>
        public static List<T> DataTableToList<T>(this DataTable table) where T : class, new()
        {
            //From: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {
                List<T> list = new List<T>();

                foreach (var row in table.AsEnumerable())
                {
                    T obj = new T();

                    foreach (var prop in obj.GetType().GetProperties())
                    {
                        try
                        {
                            //Assign the matching column value to the property
                            PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                            propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                        }
                        catch
                        {
                            //Skip properties with no matching column or an incompatible type
                            continue;
                        }
                    }

                    list.Add(obj);
                }

                return list;
            }
            catch
            {
                return null;
            }
        }


        public static T DataRowToObject<T>(this DataRow row) where T : class, new()
        {
            //Variant of the DataTableToList code from: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {
                T obj = new T();

                foreach (var prop in obj.GetType().GetProperties())
                {
                    try
                    {
                        PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                        propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                    }
                    catch
                    {
                        continue;
                    }
                }

                return obj;
            }
            catch
            {
                return null;
            }
        }


        public static List<T> DataRowArrayToList<T>(this DataRow[] dataRowArray) where T : class, new()
        {
            //Variant of the DataTableToList code from: https://codereview.stackexchange.com/questions/30714/converting-datatable-to-list-of-class

            try
            {
                List<T> list = new List<T>();

                foreach (var row in dataRowArray)
                {
                    T obj = new T();

                    foreach (var prop in obj.GetType().GetProperties())
                    {
                        try
                        {
                            PropertyInfo propertyInfo = obj.GetType().GetProperty(prop.Name);
                            propertyInfo.SetValue(obj, Convert.ChangeType(row[prop.Name], propertyInfo.PropertyType), null);
                        }
                        catch
                        {
                            continue;
                        }
                    }

                    list.Add(obj);
                }

                return list;
            }
            catch
            {
                return null;
            }
        }

    }
}
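
To show what the 1 or 2 lines of usage look like, here is a sketch with a hypothetical Customer class and an in-memory staging table (the class, column names, and sample data are all made up for illustration; a condensed copy of DataTableToList<T> is inlined so the example compiles on its own):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

static class ObjectMapperDemo
{
    // Condensed copy of the DataTableToList<T> extension from the post,
    // inlined here so the example compiles on its own
    public static List<T> DataTableToList<T>(this DataTable table) where T : class, new()
    {
        List<T> list = new List<T>();

        foreach (DataRow row in table.Rows)
        {
            T obj = new T();

            foreach (PropertyInfo prop in typeof(T).GetProperties())
            {
                try
                {
                    // Assign the matching column value; the catch below skips
                    // properties with no matching column or an incompatible type
                    prop.SetValue(obj, Convert.ChangeType(row[prop.Name], prop.PropertyType), null);
                }
                catch { }
            }

            list.Add(obj);
        }

        return list;
    }
}

// Hypothetical object whose property names match the staging table's columns
class Customer
{
    public string CustomerNumber { get; set; }
    public string CustomerName { get; set; }
    public decimal CreditLimit { get; set; }
}

class Program
{
    static void Main()
    {
        // Simulated staging table; in a real integration this would come from SQL Server
        DataTable table = new DataTable();
        table.Columns.Add("CustomerNumber", typeof(string));
        table.Columns.Add("CustomerName", typeof(string));
        table.Columns.Add("CreditLimit", typeof(decimal));
        table.Rows.Add("CUST001", "Aaron Fitz Electrical", 5000m);

        // The one line that does all the mapping work
        List<Customer> customers = table.DataTableToList<Customer>();

        Console.WriteLine(customers[0].CustomerName); // prints "Aaron Fitz Electrical"
    }
}
```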



You can also find him on Twitter, YouTube, and Google+










Thursday, February 16, 2017

Don't Forget- Standard GP SQL Views

From time to time, we still get inquiries from folks building views in SQL Server that actually already exist.  So I thought I would post a quick reminder that every SmartList has a corresponding SQL view that can be used for your own purposes as well (e.g., SQL reports, Excel queries, SmartList Builder, etc.).  And, remember, you can link views together as well as to other tables when creating reports.  Just don't modify the standard views (if you need to add to them, create a new one with the same design and then modify that).  Here are some of the most common ones available (this is NOT all of them) on any GP database...


  • AATransactions
  • Accounts
  • AccountSummary
  • AccountTransactions
  • BankTransactions
  • AttendanceDetail
  • BillofMaterials
  • CertificateList
  • Customers
  • EmployeeBenefit
  • Employees
  • EmployeeSummary
  • FixedAssets
  • FixedAssetsBooks
  • FixedAssetsPurchase
  • InventoryPurchaseReceipts
  • ItemQuantities
  • PayrollTransactions
  • PayablesTransactions
  • PurchaseLineItems
  • SalesLineItems
  • Vendors
When I teach beginner reporting classes, I advise students to always "look twice" for a standard view before embarking on creating new views or combining open/history/work tables in a SQL statement (as often the views already do this for you).  Good luck and happy reporting!
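
As a quick illustration of linking the standard views together, something like this could be the basis for a report or Excel query (the column names shown are illustrative; check the actual view definitions in your version of GP):

```sql
-- Join the standard Customers view to the SalesLineItems view
-- (column names are illustrative; verify them against the view definitions)
SELECT  c.[Customer Number],
        c.[Customer Name],
        s.[Item Number],
        s.[Extended Price]
FROM    dbo.Customers c
JOIN    dbo.SalesLineItems s
        ON s.[Customer Number] = c.[Customer Number]
```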


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Customer Service and Failure

I hate car problems.  This is a fact of my life.  My dad was a car guy.  My brother is a car guy.  But I cringe every time I have to deal with car issues.  Fortunately, we have a mechanic who we trust and have taken both of our cars to for years.  (See the parallels already with a software partner/consultant?).  So, anyway, driving home last Friday my check engine light came on.  I did my normal scary internet searching for some basic things to try, and we cycled through those over the weekend (again, anyone picking up on the parallel to working with your software solution?).


Finally, on Tuesday, we caved and took it to our mechanic.  Who we like, but always secretly cringe because we don't know enough to know how much it will cost to fix.  Our mechanic fixed the issue (for those that are wondering- engine oil pressure sensor malfunction), although naturally it was a bit more than I wanted it to be (I wanted the under $100 fix of course!).  So I am sure by now, you are wondering why (despite the clever parallels) I am blogging about car problems on a blog devoted to Dynamics GP and software implementation?  Well, it is what came next that I think is a testament to how you think about customer service and approach failures with software and with partners.


On Wednesday morning, I woke up and got myself and the kids ready for the day.  I loaded the car with 20 cases or so of girl scout cookies (our office did a cookie pool to support all of the girl scouts in the office) then we loaded up to head to drop-off and work.  As soon as I started the car, I knew I had a problem- horribly rough idle, and then the warning lights started flashing, and next thing I know the car won't go faster than 10 mph.  Ugh. Ugh. Ugh.  Transfer cookies and kid and laptop to our other car, and call the mechanic.  When I talked to one of his employees, I was told they would either come out and get it or have it towed.  A couple of hours passed, and I had not heard from them, so I texted my husband to see if he had.  His text back was a simple "Yes, they came out and it's fixed and they are test driving it around the neighborhood."


So there you go.  A mechanic who came to our house (his wife watched the store while he and his employee came out) and fixed something that was not tightened enough after the original repair.  Now, I know that some of you might think "dang right he came to your house to fix his own mistake" but I actually think of it totally differently.


Mistakes are inevitable in our work. We are human.  Software (and automobiles) are complicated.  We multi-task constantly with different clients, projects, and even software.  Now, do we expect failures to be common?  No (this would be the first time in many years that we have had to call our mechanic back after a repair).  But I would argue that true customer service lies in how we respond to failures.  Do we....
  • Take on a proactive mindset?
  • Bring "solutions" to the table?
  • Skip the defensiveness and blame game?
  • Go the extra mile to resolve the issue?
I would argue that how we respond to failure as partners builds customer loyalty because failure is unavoidable at some point in a business relationship.  We deal with imperfect people, teams/organizations (clients and partners), and software.


In talking with the project managers where I work, we often discuss that projects will have bumps.  Trying to manage to avoid any bumps at all will leave you exhausted, ineffective, and reactionary.  But by understanding that projects will have bumps (miscommunications, missed expectations, etc.) you are not "lowering the bar" (we continue to strive for excellence).  You are, by expecting the occasional issue, adopting a proactive, pragmatic, and risk-aware mindset, looking to manage the bumps, how we respond, and how we engage with the client for ultimate project success.


Look for the customer service in the failures.  That is where you will find it.  And that is where you will build the lasting partnerships (both internal and external) that will allow you and your organization to succeed.
Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.

Monday, February 6, 2017

Integration Manager eConnect Adapter Error: SQL Exception Thrown in the GetNextNumber method

By Steve Endow

I was trying to do a test in Integration Manager to confirm that the eConnect GetNextNumber methods work properly.  Simple enough.  I set up a test GL transaction with the eConnect destination adapter and ran the import.  After sitting for several minutes with no activity, I clicked on Stop and after a few more minutes, was finally able to see an error.


SQL Exception Thrown in the GetNextNumber method


Hmmm. Okay, so what is causing this?

I checked my SQL Server, but I didn't see any errors.  I then tried a SQL Profiler trace, but it just displayed some strange queries against the company databases with no references to GetNextNumber.  I checked SQL Activity Monitor, but didn't see any issues there either.  I was stumped.

Then I remembered one setting from when I created the integration.


Notice that the Server Name defaulted to localhost.  Since IM is running on the same machine as my SQL Server, I didn't think twice about it--even though I never use localhost for SQL Server connections (for a good reason, as I am demonstrating).

Well, the problem is that this server runs a named SQL Server instance, not the default instance, so localhost will not work.  I changed the Server Name to the correct SQL Server instance name and the integration ran fine, no issues.

I'm puzzled why IM didn't time out or log a meaningful error message, like "Unable to connect to SQL Server".  Fortunately, it was a simple enough fix, once I realized the problem.
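
The takeaway is the server name format: localhost (or just the machine name) only reaches the default SQL Server instance, while a named instance has to be addressed as MachineName\InstanceName.  A trivial sketch, with hypothetical machine and instance names:

```csharp
using System;

public class ServerNameDemo
{
    // Builds the Server Name value for a SQL Server connection:
    // just the machine name for the default instance, or
    // MachineName\InstanceName for a named instance
    public static string BuildServerName(string machine, string instance)
    {
        return string.IsNullOrEmpty(instance) ? machine : machine + @"\" + instance;
    }

    public static void Main()
    {
        // "localhost" only reaches the default instance
        Console.WriteLine(BuildServerName("localhost", null));

        // A named instance must include the instance name
        // (the machine and instance names below are hypothetical)
        Console.WriteLine(BuildServerName("GPSERVER", "GP2013"));
    }
}
```

That second form is what belongs in the Server Name field when SQL Server runs as a named instance.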

Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+








Integration Manager eConnect error: Could not load type System.Runtime.Diagnostics.ITraceSourceStringProvider

By Steve Endow

This is a quick note about an obscure error that I received when trying to use Integration Manager 2013 to run an eConnect integration.


eConnect error - Could not load type System.Runtime.Diagnostics.ITraceSourceStringProvider


The error isn't particularly meaningful, so there's no way to troubleshoot it directly.  Fortunately the error is unique enough that a Google search will turn up several results, such as this thread:

https://stackoverflow.com/questions/24291769/could-not-load-type-system-runtime-diagnostics-itracesourcestringprovider


The recommended solution is to install / reinstall .NET 4.5.2.  But I verified that I already had 4.5.2 on my server, so that seemed odd.
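
As an aside, a reliable way to check which .NET Framework 4.x release is installed is the Release DWORD in the registry under HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full, compared against Microsoft's documented version thresholds.  A minimal sketch of that mapping (abbreviated to the versions relevant here):

```csharp
using System;

public class NetFxVersionCheck
{
    // .NET Framework 4.x setup records a "Release" DWORD under
    // HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full.
    // Microsoft documents which Release values map to which versions;
    // a few of the relevant thresholds:
    public static string VersionFromRelease(int release)
    {
        if (release >= 393295) return "4.6 or later";
        if (release >= 379893) return "4.5.2";
        if (release >= 378675) return "4.5.1";
        if (release >= 378389) return "4.5";
        return "earlier than 4.5";
    }

    public static void Main()
    {
        // Read the Release value with regedit, PowerShell, or the
        // Microsoft.Win32.Registry class, then map it to a version:
        Console.WriteLine(VersionFromRelease(379893)); // prints "4.5.2"
    }
}
```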

Then I re-read the original question--note that the user having the issue reported that they had just upgraded their server from Windows 2008 to 2012.

Coincidentally, I also upgraded my server from 2008 R2 to 2012 R2 last week.  It would seem that the Windows upgrade process breaks something about the .NET 4.5 installation.

So I downloaded the .NET 4.5.2 web installer and reinstalled it.


After .NET 4.5.2 was reinstalled, the error went away.  Small victories!


Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+






Monday, January 30, 2017

Is a Test environment required anymore for Dynamics GP upgrades?

By Steve Endow

I've worked with several customers recently who have upgraded Dynamics GP to a new version without doing any prior testing.  The upgrade was performed in their production environment.  No test environment. No test database upgrade. No testing integrations or customizations prior to the upgrade.  Today I was informed of another customer that will be upgrading to GP 2016 without a test environment--just an upgrade of production.  Which made me wonder...


While there are probably many people who would rail against such an approach, I'm now asking a serious question:  Do you really need a Test environment anymore for a "typical" Dynamics GP upgrade?  I'm guessing that many GP customers could probably upgrade their production environment in place without any significant issues.

Yes, there are customers with complex environments that would definitely benefit from a Test environment, and yes, there are cases where upgrades encounter errors that cause the upgrade to fail.  But I suspect there are a large number of GP customers with pretty simple environments where a separate Test environment and extensive testing are not required and would be difficult to justify.

Years ago, before Microsoft purchased Dynamics GP, GP upgrades could be a harrowing experience.  Both the GP install and upgrade processes involved many steps and the GP installers weren't nearly as refined as they are now.  One of the things I noticed following the Microsoft acquisition was that the GP installation and upgrade process became much simpler, easier, and more reliable.  Whereas I used to always recommend setting up a separate test server and performing a test upgrade first, I have worked with several customers recently who have simply upgraded their production environment without any prior testing of a new version of GP.

If you make sure to take good database backups, have a few GP client backups, and have a thorough upgrade plan that has a solid rollback contingency, is it really necessary to have a separate Test environment and perform a full test upgrade first?

Are there particular modules, customizations, environment considerations, or other factors that you think make a Test environment more important?  Third party modules?  Customizations?  Integrations?  Web client?  On premise vs. hosted?  Lots of data or many company databases that cause the upgrade to take a long time?


Update: Tim Wappat made a good point on Twitter: Since many companies now run GP on virtual machines, you can easily back up the relevant VMs for quick rollback, and you can also easily clone VMs to quickly set up a "test" environment, greatly reducing the administrative costs of using a test environment to validate a GP upgrade.



Steve Endow is a Microsoft MVP for Dynamics GP and a Dynamics GP Certified IT Professional in Los Angeles.  He is the owner of Precipio Services, which provides Dynamics GP integrations, customizations, and automation solutions.

You can also find him on Twitter, YouTube, and Google+




Saturday, January 21, 2017

Riddle Me This: Fixed Assets and Long Fiscal Year

This one left me scratching my head, so I am up at 2am on a Saturday and thought I would share.  Here is the scenario...


  1. Customer has a long fiscal year due to a change in their fiscal year
  2. The long year has 14 periods; all years prior and after have 12 periods
So we adjusted the Fixed Assets Calendar (Setup-Fixed Assets-Calendar) to have 14 periods for the current year.  We also marked the option "Short/Long Year" and specified 116.67% depreciation (so that the 13th and 14th periods depreciate normally).


All ran great when the client depreciated period 13.  It is when we get to period 14 that things seem to go haywire.  When we run depreciation for period 14, it backs out the depreciation for period 13, creating a complete reversal entry.  The only items that depreciate properly are those placed in service in periods 12, 13, and 14.  Odd, right?  Well, wait, it gets better...


I can replicate all of this in sample data on GP2015 (the client is on GP2013, so I wanted to be as close to that version as possible).  So I started wondering what would happen if I backed out the period 14 depreciation.  So I did that.  I re-ran depreciation for period 13, and it backed out the incorrect entry.  But then when I re-ran depreciation for period 14, it calculated correctly.  What?  Why?  Simply backing it out and rerunning it appears to fix the problem.  Not normal, right?

From what I can tell, it has to do with reset life, and perhaps the back out process triggers a recalc of sorts.  Because if I pre-emptively run reset life, period 14 will depreciate properly the first time around.  I think there is some conflicting info out there about the need to run reset life when you are creating a long year, but you heard it here first: always run reset life if you alter (even just lengthen) a year in Fixed Assets.


Christina Phillips is a Microsoft Certified Trainer and Dynamics GP Certified Professional. She is a director with BKD Technologies, providing training, support, and project management services to new and existing Microsoft Dynamics customers. This blog represents her views only, not those of her employer.