D365BC – Data Backups

Currently D365BC is still missing a couple of things, and the licenses could be cheaper, but it is the successor of Dynamics NAV. Now, some time after the release (the name was changed quite often, the feature list changed very often, the extension base technology changed, and so on), I decided to find at least one method for backing up data, a much-missed feature. Maybe Microsoft will add that kind of feature in the future, nobody knows. But in the meantime, check out this approach.

First I checked the feature list of the current D365BC: can an existing feature be used for that functionality, or is an extension needed? In the end I found "Data Exports" in the finance module, a quite well-known feature from NAV, which can be used for exporting data to text files. I worked with D365BC – Austria. Currently I don't know in detail what the differences between the localized versions are, so please check whether Data Exports is available in "your" D365BC. 😉 If you cannot find it (which may be the case, since the export report has an object number within a localization number range), write to Microsoft and let them know that you want that feature in your localized version. 😀

OK, starting on the page "Data Exports", we add a new data export; let's call it BACKUP. Fill out the Code and Description fields.

Now we add a record definition called "MASTER" for the master data. Click the corresponding button and fill out the Code and Description fields.

One important thing is the DTD file, the file that is supposed to contain the data structure. Without a DTD file it is not possible to export data. In this case it can be kept very simple: a concrete structure is not needed, only the XML declaration. Create a new text file, call it default.dtd, add the following line and save the file:

<?xml version="1.0" encoding="UTF-8"?>

Now click Import in menu group "DTD File" and import that file. The file's name is then added to the selected data record definition line.

Now let's go to the details. Click the button "Record Source".

Here add the master tables you want to export, e.g. tables 3, 4, 6, 9, 10, 13, 14, 15, 18, 23, 27, using the "New" button. With date filters, for example, you can limit the data to export. In the column "Export Filename" you get a suggested filename for the export file, which by default ends with txt. Better change it to csv, so the files can easily be opened in Excel after the export.

Also add all the fields you want to export, using the "Add" button in the Fields/Manage menu for the selected table.

That's it. Go back to the parent page, Data Export Record Definitions, and click the button "Export" in menu group "Process".

After clicking OK, a zip file containing all the exported csv files is created and downloaded. Alternatively, you can schedule the export for a later point in time.

Also included is a file index.xml containing base data like company data, export name, list of exported files and the field names.

So, what we get here is a first backup solution for master data: at least we can export the current set of master data at any time. You could create another record definition for transaction data; after that you end up with a complete backup solution for D365BC. 😀

Additionally, you could create a job queue entry for an automatic export: use report 11015 and set the recurring flags.

Set the request page options (e.g. the start and end date) as needed.
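If you prefer to create the job queue entry in code instead of via the UI, a minimal C/AL sketch could look like the one below. The table "Job Queue Entry" and codeunit "Job Queue - Enqueue" are standard application objects; the concrete values (days, starting time) are only example assumptions.

// JobQueueEntry : Record "Job Queue Entry"
JobQueueEntry.INIT;
JobQueueEntry.VALIDATE("Object Type to Run",JobQueueEntry."Object Type to Run"::Report);
JobQueueEntry.VALIDATE("Object ID to Run",11015); // Export Business Data
JobQueueEntry."Recurring Job" := TRUE;
// example schedule: every weekday at 22:00
JobQueueEntry."Run on Mondays" := TRUE;
JobQueueEntry."Run on Tuesdays" := TRUE;
JobQueueEntry."Run on Wednesdays" := TRUE;
JobQueueEntry."Run on Thursdays" := TRUE;
JobQueueEntry."Run on Fridays" := TRUE;
JobQueueEntry."Starting Time" := 220000T;
CODEUNIT.RUN(CODEUNIT::"Job Queue - Enqueue",JobQueueEntry); // inserts the entry and sets it to Ready

In AL for D365BC the code is nearly identical, only the keyword casing conventions differ.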

cheers

 

Get next object version number

Every developer sometimes faces the question: what is the next version number for my new object? If a couple of developers are working in the same database, you need some kind of version control.

I've developed a page which calculates the next version number for a defined version prefix. The version syntax is: <PREFIX><Main No.>.<2-digit Sub No.>.

Let's start with a list of page objects with different version lists, versions ARCH1.00 to ARCH1.06. It does not matter if the version list of an object contains more than one value.

After running the page, set the version prefix, here ARCH, and invoke the action "Get next version". We then get the new version: ARCH1.07.
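If you want to rebuild something similar yourself, the core of the calculation is just a loop over the "Version List" field of the Object table. The following minimal C/AL sketch uses my own variable names and simplified parsing; it is not the downloadable page itself, just an illustration of the idea.

// Obj : Record Object;  VersionPrefix, VersionValue, NextVersion : Text;
// MainNo, SubNo, MaxMainNo, MaxSubNo, i, ValueCount : Integer;
MaxMainNo := 1;
MaxSubNo := -1; // so that the first matching version becomes the maximum
Obj.SETFILTER("Version List",'*' + VersionPrefix + '*');
IF Obj.FINDSET THEN
  REPEAT
    // a version list may contain several comma separated values, e.g. 'NAVW111.00,ARCH1.03'
    ValueCount := STRLEN(Obj."Version List") - STRLEN(DELCHR(Obj."Version List",'=',',')) + 1;
    FOR i := 1 TO ValueCount DO BEGIN
      VersionValue := SELECTSTR(i,Obj."Version List");
      IF (COPYSTR(VersionValue,1,STRLEN(VersionPrefix)) = VersionPrefix) AND
         (STRPOS(VersionValue,'.') > 0)
      THEN BEGIN
        VersionValue := COPYSTR(VersionValue,STRLEN(VersionPrefix) + 1); // e.g. '1.06'
        IF EVALUATE(MainNo,SELECTSTR(1,CONVERTSTR(VersionValue,'.',','))) AND
           EVALUATE(SubNo,SELECTSTR(2,CONVERTSTR(VersionValue,'.',',')))
        THEN
          IF (MainNo > MaxMainNo) OR ((MainNo = MaxMainNo) AND (SubNo > MaxSubNo)) THEN BEGIN
            MaxMainNo := MainNo;
            MaxSubNo := SubNo;
          END;
      END;
    END;
  UNTIL Obj.NEXT = 0;
// build the next version, e.g. maximum ARCH1.06 -> next ARCH1.07
MaxSubNo += 1;
IF MaxSubNo > 99 THEN BEGIN
  MaxMainNo += 1;
  MaxSubNo := 0;
END;
NextVersion := VersionPrefix + FORMAT(MaxMainNo) + '.' + COPYSTR(FORMAT(100 + MaxSubNo),2,2); // 2-digit sub no.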

You can download the page here.

 

Export Nav Objects by Code

Exporting locked NAV objects can be a problem when importing into the target database; it is not that easy to unlock them there. So instead you can export NAV objects by code and check the Locked status before exporting.

Create a new report, add the dataitem Object and set the report to ProcessingOnly.

Select the dataitem Object and set the property ReqFilterFields to Type,ID.
On the request page, add a field Path (Text).

Add the following code to the trigger OnOpenPage:

Object.SETRANGE(Type,Object.Type::Table);
Path := 'c:\temp';

Add the following code to the report trigger OnPreReport():

finsql := 'C:\Program Files (x86)\Microsoft Dynamics NAV\100\RoleTailored Client\finsql.exe';
IF NOT FILE.EXISTS(finsql) THEN
  ERROR('finsql not found');

Additionally, add this code to the trigger Object – OnAfterGetRecord():

IF Object.Locked THEN BEGIN
  MESSAGE(FORMAT(Object.Type) + '-' + FORMAT(Object.ID) + ' is locked.');
  // alternatively unlock the object, then try again:
  // Object.Locked := FALSE;
  // Object.MODIFY;
END ELSE BEGIN
  arguments := 'command=exportobjects,file=%1,servername=%2,database=%3,filter="Type=%4;ID=%5",ntauthentication=1';
  arguments := STRSUBSTNO(arguments,Path + '\' + FORMAT(Object.Type) + '-' + FORMAT(Object.ID) + '.fob','localhost','Cronus',Object.Type,Object.ID);
  Process.Start(finsql,arguments);
  result := result + FORMAT(Object.Type) + '-' + FORMAT(Object.ID) + '\';
END;

Add this to the trigger Object – OnPostDataItem():

MESSAGE(result);

Global Variables:

Process     DotNet   System.Diagnostics.Process.'System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
arguments   Text
finsql      Text
Path        Text
result      Text

cheers

How to work with big item descriptions

Sometimes very long descriptions need to be saved, but in NAV text fields can hold only 250 characters. In addition, you may want to search within these long text values. You could add a couple of extra text fields to store long texts, or save the text in text files and attach them to the item, but what about searching? Not that easy.
Another option for saving long texts is the use of BLOB fields. For that option I developed a solution.

First add a new field "Description 3" to the table Item, type BLOB, subtype Memo.
Then edit the page Item Card: add a global variable Desc3Txt of type Text with no length, and add the variable as a new field to the item card with Editable=False, MultiLine=Yes.
Add the following code to the trigger OnAfterGetRecord of the item card page:

// InStr | InStream
CALCFIELDS("Description 3");
IF "Description 3".HASVALUE THEN BEGIN
  "Description 3".CREATEINSTREAM(InStr);
  InStr.READ(Desc3Txt);
END;

Add the following to the trigger Desc3Txt – OnAssistEdit():

// OutStr | OutStream
// EditCtrl | DotNet | Archer.TextEdit.'Archer.TextEdit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=1465b259ee2284cb'
CLEAR(EditCtrl);
EditCtrl := EditCtrl.TextEdit;
EditCtrl.Load(Desc3Txt);
EditCtrl.ShowDialog;
Desc3Txt := EditCtrl.Save;
EditCtrl.Close;
CLEAR(EditCtrl);
"Description 3".CREATEOUTSTREAM(OutStr);
OutStr.WRITE(Desc3Txt);
MODIFY;
CurrPage.UPDATE;

Page item card with new multiline text field “Description 3” and Assist Button.

Clicking the assist button starts the TextEdit control and loads the current text. After changing the text and closing the control, you are asked whether you want to apply the changed text.

Now we need a way to search within the new text/BLOB field; it cannot be searched using the standard search function. For that we build a new page, Item Search.

Create a new page with object ID 50000, name Item Search. Add a group under the content area, add a global text variable SearchString, and add a new field under the group with SearchString as SourceExpr. Add another group and add a new line of type Part; we will set the value for the property PagePartID later.

To show the search result we need another page: type ListPart, object ID 50001, name Item Search Result. As source of the new page we need a new table: object ID 50000, name Item Search Result.

The new page 50001:

Properties: Editable=False, SourceTable=50000 (Item Search Result), SourceTableTemporary=Yes.

Now add a global function SetData(SearchFilter : Text) to the new page and add the following code to it:

// local variables
// Item, Record, Item
// ItemSearchResultLine, Record, Item Search Result
// InStr, InStream
// Desc3Txt, Text
// LineNo, Integer

DELETEALL;
LineNo := 10;
Item.FINDSET;
REPEAT
  Item.CALCFIELDS("Description 3");
  IF Item."Description 3".HASVALUE THEN BEGIN
    Item."Description 3".CREATEINSTREAM(InStr);
    InStr.READTEXT(Desc3Txt);
    IF STRPOS(LOWERCASE(Desc3Txt),LOWERCASE(SearchFilter)) > 0 THEN BEGIN
      "Line No." := LineNo;
      "Item No." := Item."No.";
      Description := Item.Description;
      "Description 2" := Item."Description 2";
      "Description 3" := COPYSTR(Desc3Txt,1,250); // first 250 chars
      INSERT(FALSE);
      LineNo += 10;
    END;
  END;
UNTIL Item.NEXT = 0;
CurrPage.UPDATE(FALSE);

Now you can set the property PagePartID in the part line in page 50000 to 50001.

For calling the search function we need a Search button on page 50000.
Add the following code to the trigger Search – OnAction():
CurrPage.ItemSearchResultLines.PAGE.SetData(SearchString);

The new Item Search Page with a search result.

You can download the TextEdit Control here.

Followup:
You could simplify the solution by:
* Search page: Use only page 50001, add a second group at the top with field SearchString. So page 50000 is not needed.
* Page Item Card: remove the TextEdit control and the corresponding code, set the field Desc3Txt to editable, and add the code that fills the BLOB field "Description 3" via an OutStream to the trigger Desc3Txt – OnValidate (see the sketch below).
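For the second point, the OnValidate code is essentially the write part of the OnAssistEdit trigger shown above, just without the DotNet control; a minimal sketch:

// Desc3Txt - OnValidate()
// OutStr | OutStream
"Description 3".CREATEOUTSTREAM(OutStr);
OutStr.WRITE(Desc3Txt);
MODIFY;
CurrPage.UPDATE;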

cheers

 

Mass data import

There was an issue with importing all UK post codes from a csv file, about 3 million post codes. Importing via RapidStart Services (Excel import) can cause buffer overflow messages, and Excel itself has row/size limitations. Increasing MaxNoOfXMLRecordsToSend in the config file ClientUserSettings.config from the default value 5000 to e.g. 20000 is no problem and can help. Changing MaxUploadSize in the server config file CustomSettings.config (also available via the NAV server administration console) is another option. A better choice for mass data import are dataports (older NAV versions) and XMLports.

Another option is to develop a report which imports the file contents and loops through the lines: quite simple, and no memory issues.

Create a new report and add the following code to the trigger OnPreReport():

OnPreReport()
// variables
// PostCode, Record, Post Code
// file, File
// fileName, Text, 250
// line, Text, 1024
// dlg, Dialog
// idx, Integer
// txtValue, Text, 100

// code
PostCode.DELETEALL(FALSE);
COMMIT;
dlg.OPEN('#1###### #2######'); // show progress dialog
idx := 1;
// downloaded post code file from https://www.doogal.co.uk/PostcodeDownloads.php as test file,
// size: 500k, 2.1m lines, some of them contain obsolete post codes
fileName := 'c:\temp\England postcodes.csv';
file.WRITEMODE := FALSE;
file.TEXTMODE := TRUE;
file.OPEN(fileName);
file.READ(line); // skip header line
WHILE file.READ(line) > 0 DO BEGIN
  // skip obsolete post codes: 2nd value = No
  IF SELECTSTR(2,line) = 'Yes' THEN BEGIN
    PostCode.Code := SELECTSTR(1,line);
    PostCode.VALIDATE(City,GetValue(SELECTSTR(15,line)));
    PostCode.County := GetValue(SELECTSTR(8,line));
    PostCode."Country/Region Code" := 'GB';
    PostCode.INSERT;
    dlg.UPDATE(1,idx);
    dlg.UPDATE(2,PostCode.Code);
    idx += 1;
  END;
END;
file.CLOSE;
dlg.CLOSE;
MESSAGE(FORMAT(idx - 1) + ' post codes imported.'); // idx starts at 1, so subtract 1

LOCAL GetValue(txtValue : Text[100]) : Text
txtValue := DELCHR(txtValue,'<','"'); // remove leading "
txtValue := DELCHR(txtValue,'>','"'); // remove trailing "
IF STRLEN(txtValue) > 30 THEN // fields City and County are Text[30]
  txtValue := COPYSTR(txtValue,1,30); // cut text to the leading 30 chars
EXIT(txtValue);

The report imports about 1.5 million valid lines/records in roughly 5 minutes.

cheers

List pages in Nav 2017 – a new look and feel

TharangaC has written an interesting post about one of NAV 2017's new features. Maybe not really a new process or technical feature; it's more of a Wow!

This web client feature is the new optional style of list pages. It is possible to show the items in a list like in a catalogue, which feels more like a convenient phone/tablet app. It seems that the web client gets more cool features than the Windows client.

For the complete blog post, follow the link here.

cheers

 

NAV 2017 ready for download

Just checked PartnerSource: the first release of NAV 2017 is ready for download there. Great thing! It seems that more new features are being developed for the web client. As far as I know, the Windows client will not be continued, which may be one reason.

You'll find a complete "What's new" list here:

  • Office 365 Experience
  • Embedded Power-BI
  • Better/Easier Setup and Configuration, Setup wizards
  • Notifications for the Web Client
  • Explorer like lists with thumbnails
  • Finance Enhancements: Default Account Schedules, easier Reporting
  • Jobs Enhancements
  • CRM: New Wizard
  • Item attributes: long missed
  • Integration: Payment services, Paypal, OCR Line recognition
  • Cortana Intelligence: Sales and inventory forecasts
  • PowerApps & Microsoft Flow

Not all features are available in the first release.

So let’s start with NAV 2017. Well done, Microsoft. 😀

Cheers

Textual Data Export by Configuration

A feature often requested by customers is a data export to a text/csv file, which can then be edited in Excel or a text editor.

Yes, you can use RapidStart Services. But with that feature you can only export data in Excel format, not in text format. That is fine for editing in Excel and reimporting the changed data back into NAV, but for data exchange scenarios you usually need text format. Another point is that RapidStart Services are often said to be not that easy to use for end users and to have a couple of bugs; I have also read quite often that the feature is not recommended at all. So what else can a customer do? Contact the NAV partner, who then develops a new XMLport, report or codeunit to do the job. That is the usual way.

The German localized version of Dynamics NAV contains a really nice feature simply called "Data Exports". You can find it under Departments/Administration/Application Setup/Financial Management/General/Data Exports. This feature is mainly used to export business data for auditing purposes according to the GDPdU (principles of data access and verifiability of digital documents).

With the feature "Data Exports" there is an additional way to handle these kinds of requests: it is possible to export data to csv files without any development! Great thing. So let's have a look at how we can use it.

First we need at least one definition group. Then we define the corresponding record definition (button "Record Definitions").

Here we need values for Code, Description and Export Path. In this case I want to export sales orders. "DTD File Name" is a mandatory field. I checked the code: the file is not processed, and using different kinds of DTD files does not change anything in the resulting export files. The name of the DTD file is simply written to the file index.xml, which contains the data structure. To get the export running we create an empty text file, call it empty.dtd and add only one line of text:

""

Save the text file, click the button Import in menu tab "DTD File", select the file empty.dtd and click OK. After that the file is imported and the file's name is set in the column "DTD File Name".

Now let's define the table and the fields. For that, select the record definition and click "Record Source".

In the header area we set Table No. to 36 (Sales Header) and Key No. to 1 (primary key). Optionally you can set the Period Field No., here 19 (Order Date). In the column Table Filter you can set filters if you do not want to export all records. Here I set the field to "Sales Header: Document Type=Order", because I only want to export sales orders.

In the fields area we add the export fields by clicking the Add button. The key fields should come first; with Move Up and Move Down you can change the position of the fields. That's all.

You can, if you wish, add more tables in the header area and set relationships, e.g. between Sales Header and Sales Line. But for now let's run the export. First we click Validate and get "The data export record source validated correctly.", meaning everything is OK.

So, after closing the page "Data Export Record Source" we are back on the page "Data Export Record Definitions". Here we click the button Export in menu tab "Process". Report 11015 "Export Business Data" is started:

Here we set the start and end date. These fields are mandatory and are applied to the period field of the header line, in this case the field "Order Date". After execution we get the files in the export path, subfolder SALESORDER.

File empty.dtd is simply copied, file index.xml contains the data structure:

and file SalesHeader.txt contains the exported data (csv format):

There we are!
We got a csv export file without any development, only by configuration!

As mentioned before, this can be extended by adding more lines in the header area of the page "Record Source":

Here you can see two header lines, "Sales Header" and "Sales Line"; the latter is indented and has a defined relationship, shown on the relationship page in the bottom right corner of the screenshot above. In that case you get two export files, one per table.

Important: If you are interested in that feature, you could download the German localized version. But for that you need the according license!
Exporting the NAV objects as a text export and renumbering them into the 50000s object range is possible, but illegal! So if you like this kind of feature, try to redevelop the functionality. Do not copy these objects to your database and do not reuse the code. If you want to do that, please contact your NAV partner or Microsoft for license clarification. You could also suggest adding that functionality to the W1 version.

For more details about that feature, follow this link.

cheers

Item Attributes on NAV 2016

Item Attributes were introduced with NAV 2017, a long-missed feature. Nareshwar Raju Vaneshwar has written an interesting post on how to backport that feature to NAV 2016. It's quite easy, so have a look.

We were very impressed by one of the features of NAV 2017 – Limited Beta, Item Attributes. Since NAV 2016 is widely popular, I thought of bringing back the feature to NAV 2016 as well. I have…

Source: Item Attributes on NAV 2016