I am afraid that I will have to rethink my design for the gift batch validation. However, I wanted to check to be sure.

In order to create the Gift Batch Import validation routines, I was planning to add a helper object, CSV Parser, for the existing gift import. The division of responsibility that I envisioned was that the ImportGiftBatches method would continue to manage files and database interactions. It would retrieve a single record from the CSV import file and call the appropriate method in CSV Parser. CSV Parser would receive the CSV record, parse it, and return a "free-standing" table row (or rows, in the case of gift/gift detail) along with a Verification Result Collection of any parsing errors. ImportGiftBatches would then call the existing validation framework to complete the verification of the returned row. If those checks passed, it would apply the appropriate key to the row and insert it into the database.
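For illustration, the parser entry point I have in mind might look roughly like this (all of the names here - TGiftCsvParser, ParseGiftDetailRecord and the parameters - are placeholders I am inventing for this sketch, not existing OpenPetra code):

    // hypothetical helper class; parses one CSV record into free-standing row(s)
    public class TGiftCsvParser
    {
        // returns a small table holding the new row(s); any parsing problems
        // are added to AVerificationResults for the caller to inspect
        public AGiftDetailTable ParseGiftDetailRecord(string ACsvRecord,
            TVerificationResultCollection AVerificationResults)
        {
            AGiftDetailTable GiftDetailDT = new AGiftDetailTable();
            // ... split ACsvRecord, create and populate a new row, add it to GiftDetailDT ...
            return GiftDetailDT;
        }
    }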

The benefit that I saw with this structure was that it seemed very clean from a testing point of view. Because each component would have minimal interdependency, unit test requirements could be defined more clearly and test setup would be minimized.

Is it possible with the current data set objects and methods to create independent table rows, and later put them into the connected dataset and update the database? I am now thinking that my plan won't work because it is only possible to create gift batch, gift and gift detail rows which are included in a dataset actively linked to the database. Is that correct?

Hi Doug,

the design sounds clean to me.

There is no problem with creating Typed DataRows of Typed DataTables in memory only; they can eventually be committed to the DB through our DataStore Methods.

Example:
If you want to create Typed DataRows of Type AGiftDetailRow in the CSV Parser, do the following (a consolidated code sketch follows after step 4):

1) In the CSV Parser: create a Typed DataTable object that represents the DB Table that the DataRows should end up in, e.g. GiftBatchTDSAGiftDetailTable GiftDetailDT = new GiftBatchTDSAGiftDetailTable("AGiftDetail");. (You can find out which Type you need to use for a particular Typed DataTable by inspecting what Type the Typed DataTable is in the Typed DataSet, e.g. by hovering over a FMainDS.AGiftDetail statement in code - you will see it is of Type 'GiftBatchTDSAGiftDetailTable'.)
2) Create a Typed DataRow (populated with default values, if you want that to happen): AGiftDetailRow GiftDetailsDR = GiftDetailDT.NewRowTyped(true);
3) Perform various assignments to set up data in the DataRow, e.g. 'GiftDetailsDR.LedgerNumber = x;'.
4) Add the DataRow to the Typed DataTable, e.g. 'GiftDetailDT.Rows.Add(GiftDetailsDR);'.
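Put together, the parser-side steps might look roughly like this (a sketch only; the LedgerNumber assignment stands in for whatever the parsed CSV record actually contains):

    // inside the CSV Parser: build a free-standing Typed DataTable and row
    GiftBatchTDSAGiftDetailTable GiftDetailDT = new GiftBatchTDSAGiftDetailTable("AGiftDetail");

    // NewRowTyped(true) populates the new row with the column default values
    AGiftDetailRow GiftDetailsDR = GiftDetailDT.NewRowTyped(true);

    // fill the row from the parsed CSV record (placeholder assignment)
    GiftDetailsDR.LedgerNumber = 43;

    // attach the row to the free-standing table; nothing touches the DB at this point
    GiftDetailDT.Rows.Add(GiftDetailsDR);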

At the end of the CSV Parser method you return GiftDetailDT to the calling method (ImportGiftBatches), which stores it in e.g. 'TmpGiftDetailDT'. The calling method can then simply merge the DataRow that you created in the CSV Parser into the Typed DataSet that will in the end get committed to the DB (FMainDS) by issuing FMainDS.Merge(TmpGiftDetailDT) - that is, if the CSV Parser raises no serious Data Validation problems for that DataRow. You can repeat that process with as many rows as you want and will end up with all the Typed DataRows added to the Typed DataTable FMainDS.AGiftDetail, which in the end gets committed to the DB, e.g. using AGiftDetailAccess.SubmitChanges(FMainDS.AGiftDetail).
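On the calling side, one iteration of the import loop could then look roughly like this (a sketch only; CsvParser, ParseGiftDetailRecord and ParsingOK are placeholder names, and the exact SubmitChanges overload should be checked against the generated DataStore code):

    // in ImportGiftBatches: process one CSV record
    GiftBatchTDSAGiftDetailTable TmpGiftDetailDT =
        CsvParser.ParseGiftDetailRecord(CsvRecord, VerificationResults);

    if (ParsingOK)   // i.e. no serious Data Validation problems were reported
    {
        // merge the free-standing row(s) into the DataSet that gets saved later
        FMainDS.Merge(TmpGiftDetailDT);
    }

    // ... after all CSV records have been processed:
    AGiftDetailAccess.SubmitChanges(FMainDS.AGiftDetail);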

I hope that is clear; in case you have further questions, don't hesitate to ask.

Great!

As I'm coding, I am noticing that there seem to be two detail table objects:
AGiftDetailTable
GiftBatchTDSAGiftDetailTable (which inherits from AGiftDetailTable)

And then there are two detail row objects as well:
AGiftDetailRow
GiftBatchTDSAGiftDetailRow

Does it matter which one I use?

Hello Doug,

you can see the difference (additional columns) here:

github.com/tpokorra/openpetragi ... taSets.xml
The GiftDetail table has been extended for the grid, so that we don't need a separate grid for the a_gift table.

So for saving, you can ignore the additional fields.
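In other words, GiftBatchTDSAGiftDetailTable derives from AGiftDetailTable (and likewise for the Row types) and only adds calculated columns for display. A small illustration (the ledger number is just a placeholder value):

    // a row created from the TDS table is also a valid AGiftDetailRow
    GiftBatchTDSAGiftDetailTable GiftDetailDT = new GiftBatchTDSAGiftDetailTable("AGiftDetail");
    AGiftDetailRow GiftDetailsDR = GiftDetailDT.NewRowTyped(true);

    // the real a_gift_detail columns live on the base class; these are what get saved
    GiftDetailsDR.LedgerNumber = 43;   // placeholder value

    // the additional columns defined in the GiftBatchTDS DataSet XML exist only for
    // the grid and can simply be left untouched when importing and saving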

In general, I worry that you are making the validation of gift imports too complicated. I don't want another very complicated structure that would be difficult to maintain.

I prefer it simple: get all the data from CSV into a dataset, as it is already done. If something fails already during parsing, return an error immediately.
Then run a validation to test for valid partner keys etc., and collect those errors in a Verification Result collection.
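For example, recording a failed partner key check might look roughly like this (a sketch only; PartnerKeyIsValid is a placeholder for the actual lookup, and the exact TVerificationResult constructor overload should be checked against the current Ict.Common.Verification code):

    TVerificationResultCollection VerificationResults = new TVerificationResultCollection();

    if (!PartnerKeyIsValid)   // placeholder for the real partner key check
    {
        // record the problem instead of aborting; the caller decides what to do with it
        VerificationResults.Add(new TVerificationResult("Gift Import",
            "Partner key " + PartnerKey.ToString() + " does not exist",
            TResultSeverity.Resv_Critical));
    }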
Usually, the users import valid files.
It is an open source project after all, so I would wait for real user feedback, and then invest more into the gift import, if users actually say that it is not nice enough.
Just my opinion, Matthias has the last word.

Timotheus

Well, actually, it's more a case of it being complicated to write simple code :)

That is, my design is really quite straightforward. And it simplifies writing and maintaining the unit test code as well.

What has been a bit complicated (or at least slow, given my limited time) is gaining an understanding of the workings of the many OpenPetra framework objects, interfaces, inheritances, et al. that I need to use properly in order to achieve my ends.