Overview
To fully leverage the Datasets Module, it's essential to understand several advanced aspects, such as:
- Managing parallel requisitions.
- Doing synchronous or asynchronous requests.
- Mapping data to server-side or client-side tags, and understanding the application, performance, and related considerations.
On this page, you will find advanced topics that help you better understand these aspects of the Datasets Module.
Introduction
This document provides an in-depth guide on how to establish connections between your FactoryStudio project and Microsoft Excel files. Excel is a widely used spreadsheet application, and integrating data from Excel files can greatly enhance the flexibility and efficiency of your project. The ability to connect to Excel files allows for seamless data exchange and collaboration between your project and various data sources.
This guide will cover various methods for connecting FactoryStudio with Excel files, including the use of ODBC (Open Database Connectivity), the ODBC Microsoft Excel Driver, configuring ODBC with a Data Source Name (DSN), and utilizing OleDB. Each of these techniques offers unique advantages and capabilities, ensuring that you can find the most suitable approach for your specific needs.
Datasets Module Execution
The Dataset module facilitates efficient database interactions by utilizing TServer services. While it doesn't directly interact with databases, it manages synchronous and asynchronous executions for optimal performance. Understand the differences between these methods by reading more on the Datasets Module Execution page.
Data Management
The Dataset Module offers versatile methods for managing data and concurrency within solutions, including Data Table tags and Async Contents. Understand how to avoid conflicts and optimize performance by reading more on the Data Management page.
Datasets Runtime Attributes
The Datasets Namespace exposes properties and methods from the .NET objects used by the Dataset Module execution. You can use these properties and methods on your Displays or to create Scripts and Alarms. The Dataset Runtime Attributes page lists all available options.
Preventing SQL Injections
When calling Stored Procedures in the database, there's a significant concern about SQL injection, especially when passing parameters as plain text in SQL statements. To mitigate this, we utilize the .NET API, where parameters are added to a list, effectively preventing code injection. Therefore, we advise using parameters in Stored Procedure calls with the following syntax:
```
execute <StoredProcedureName> <@paramName1>={Tag.Tag1}, <@paramName2>="{Tag.TagStr}", <@paramName3>={Tag.OutValue} OUTPUT, ... , @RETURN_VALUE={Tag.RetValue} RETURNVALUE
```
This approach parameterizes Stored Procedure calls and guards against SQL injection attacks. Each parameter value is strictly interpreted as a value and not part of the SQL command, providing a critical layer of security against unauthorized database manipulation.
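The effect of parameter binding can be illustrated outside the platform as well. The following is a minimal sketch using Python's standard sqlite3 module; the table and values are made up for illustration, and since SQLite has no stored procedures, a plain parameterized query stands in for the call syntax above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Malicious input that would terminate the string literal and inject
# SQL if it were concatenated into the statement text.
user_input = "alice' OR '1'='1"

# Unsafe: string concatenation lets the input rewrite the query,
# so the WHERE clause matches every row.
unsafe_sql = "SELECT role FROM users WHERE name = '" + user_input + "'"
leaked = conn.execute(unsafe_sql).fetchall()

# Safe: the driver passes the value as a bound parameter, so it is
# compared literally and never parsed as SQL.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
conn.close()
```

The bound query returns no rows for the malicious input, while the concatenated one leaks the admin row, which is exactly the class of attack the parameterized syntax prevents.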
Network Gateway Access And Time Zone Handling
In some scenarios, specific databases might not be accessible from all computers due to network topology or security constraints. The ServerIP column can redirect database access commands to a machine with the necessary permissions, provided you install our platform on that machine. Only the TWebServer would be active in such cases, handling database interactions, receiving commands, and establishing connections. This method offers a workaround for database access restricted to specific machines.
Handling Time Zones
The platform treats all Date and Time type tags as UTC. Consequently, the dates written to and read from the standard database, such as historical data and alarms, are also in UTC. If it is necessary to read and write dates in the local timezone, account for the time difference when preparing the SQL statement or using the WhereCondition property. Other external data tables, from third parties or the client, can use any timezone. To assist in handling this data, you can configure the DateTimeMode column with the type of timezone used in the database (UTC or Local Time), and the system will automatically convert it when reading and writing to tags.
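As a sketch of the conversion that the DateTimeMode column automates, the example below converts a UTC tag timestamp to a site's local time and back, assuming a hypothetical site at UTC-3 (the date and offset are illustrative only):

```python
from datetime import datetime, timezone, timedelta

# Hypothetical tag timestamp: the platform stores Date/Time tags in UTC.
tag_utc = datetime(2024, 5, 1, 12, 0, 0, tzinfo=timezone.utc)

# If an external table uses local time (DateTimeMode = Local Time),
# convert before writing; here we assume a UTC-3 site as an example.
site_tz = timezone(timedelta(hours=-3))
local_value = tag_utc.astimezone(site_tz)   # 09:00 at the site

# Reading back: convert the stored local time to UTC for the tag.
round_trip = local_value.astimezone(timezone.utc)
```

The same offset arithmetic applies when preparing SQL statements or a WhereCondition against a database that stores local time.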
Dataset Module Performance
The Dataset Module's performance depends on many factors, including database performance, network latency, and the complexity of executing SQL queries. The platform will minimize overhead and execute queries as efficiently as possible. However, ultimately, the performance of the Dataset module is tied to these external factors. It's essential to design your database schema and queries with performance in mind and consider investing in high-quality hardware and network infrastructure to ensure optimal performance.
Error Handling
Error handling in the Dataset module is straightforward. If an error occurs during the execution of a command, the error message will update the module's Error property (Last Status). You can monitor this property to handle errors in your application. Furthermore, if an error occurs during the execution of a synchronous method, the process will return an empty Data Table and update the Error property. Alternatively, you can call methods like SelectCommandWithStatusAsync, where the status will be an output parameter in the method.
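The pattern of returning an empty table together with a status output can be sketched generically. The helper below is hypothetical and illustrative only (it is not the platform's actual API), using Python's sqlite3 as a stand-in database:

```python
import sqlite3

def select_with_status(conn, sql):
    """Illustrative helper mirroring the pattern described above:
    on failure, return an empty result set plus an error status."""
    try:
        return conn.execute(sql).fetchall(), None
    except sqlite3.Error as exc:
        return [], str(exc)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")

# Successful query: rows returned, no error status.
rows, err = select_with_status(conn, "SELECT x FROM t")

# Failing query: empty result, error status populated.
bad_rows, bad_err = select_with_status(conn, "SELECT * FROM missing")
conn.close()
```

Monitoring the status output after each call, rather than assuming success, is the key point of the pattern.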
These are some key points you should keep in mind while using the Dataset module. It's a powerful tool that provides a lot of flexibility. However, it also requires careful use and understanding of its features and limitations. Always keep best practices in mind when designing and implementing your database interactions. Feel free to seek expert advice if you need clarification on anything.
Backup of SQLite Databases
There are several ways to back up SQLite databases in a solution. One simple method is to make a copy of the database file itself, but this method may not be suitable if the database is actively being written to or is very large. Here are a few suggestions for backing up SQLite databases in your solution:
SQLite provides a built-in backup command that can be used to create a backup of a database file. The syntax for this command is as follows:
```
sqlite3 source_database.sqlite ".backup backup_file.sqlite"
```
This command creates a backup of the source database and stores it in a new file called backup_file.sqlite. You can then copy this file to a safe location for backup purposes.
- SQLite also provides an Online Backup API that you can use to create a backup of a database file while your application is still using it. This API offers several benefits over the backup command, such as the ability to perform incremental backups and monitor the progress of the backup process. Read more in the SQLite Backup API documentation.
Regardless of your chosen method, it's crucial to store your backup files in a secure location and test them regularly to ensure that you can use them to restore your database if needed.
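As an illustration of the Online Backup API mentioned above, Python's standard sqlite3 module wraps it in Connection.backup. The database contents and file names below are placeholders for this sketch:

```python
import os
import sqlite3
import tempfile

# Stand-in source database; in a real solution this would be the
# solution's .sqlite file, opened while the application is running.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE alarms (id INTEGER PRIMARY KEY, message TEXT)")
src.execute("INSERT INTO alarms (message) VALUES ('High temperature')")
src.commit()

# Online backup: copies the live database in increments of `pages`
# pages at a time, without locking the source for the whole duration.
backup_path = os.path.join(tempfile.gettempdir(), "backup_file.sqlite")
dst = sqlite3.connect(backup_path)
src.backup(dst, pages=64)
dst.close()
src.close()

# Verify the backup is a usable, independent copy.
check = sqlite3.connect(backup_path)
rows = check.execute("SELECT message FROM alarms").fetchall()
check.close()
```

A progress callback can also be passed to Connection.backup to monitor completion, which is one of the benefits the Online Backup API offers over the command-line .backup command.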
In this section:
Microsoft Excel Connections
Using ODBC
ODBC offers a standardized and widely supported way of connecting various applications to databases, ensuring compatibility across diverse data sources.
To connect FactoryStudio with an Excel file, first, select and name a range of rows and columns in the worksheet. This will allow the software to read the information as a table. Choose one of the following naming processes based on your version of Microsoft Excel:
For Microsoft Office 365:
Navigate to Formulas → Define Name → Define Name...
For Microsoft Office 2007:
Right-click the selection and choose Name a Range....
For Microsoft Office 2003:
In Microsoft Excel, go to Insert → Name → Define....
Name the selection (e.g., "itemsTable"). The Excel file is ready to be used.
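Once the range is named, the driver can expose it as a table and it can be queried by name. A hypothetical query against the example name above (whole worksheets would instead use the `[Sheet1$]` form):

```sql
SELECT * FROM itemsTable
```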
Using ODBC Microsoft Excel Driver
The ODBC Microsoft Excel Driver provides a dedicated driver for Excel, ensuring optimal performance and seamless integration with the Excel file format.
To connect using this method, follow these steps:
- In the datasets namespace, choose the DBs tab and create a new provider by clicking Create new.
- Select the "Odbc Data Provider" in the "Provider" data field.
- For the "Database" field, choose Microsoft Excel Database.
- Click Ok.
- A new row is created in the data grid; click the ConnectionString column.
- In the pop-up window, enter the path and the filename in the "Dbq" field.
- Click the Test button to ensure that the connection is OK.
Note: The Test step is optional.
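For reference, a Dbq-style ODBC connection string for the Excel driver typically looks like the following; the path and file name are illustrative placeholders:

```
Driver={Microsoft Excel Driver (*.xls)};Dbq=C:\MyProject\items.xls;
```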
Using ODBC with a DSN
Using ODBC with a DSN allows for centralized management and easier maintenance of data source connections, simplifying the configuration process.
To connect using this method, follow these steps:
- Go to your computer's Control Panel and select Administrative Tools.
- Double-click Data Sources (ODBC).
- In the "ODBC Data Source Administrator" window, click Add. You are prompted to select a driver.
- Select "Microsoft Excel Driver (*.xls)".
- Click Select Workbook and select the name of the Excel file that was previously created.
- Name the Data Source (e.g., "excelDatasource").
- For Write access, uncheck the "ReadOnly" checkbox.
Using OleDB
OleDB enables a higher level of abstraction and flexibility when connecting to data sources, supporting a wider range of data access technologies and file formats.
To connect using this method, follow these steps:
- In the Datasets namespace, choose the DBs tab.
- Select the "OleDb data provider" option in the combo-box and create a new provider by clicking create new.
- Choose the "Microsoft Excel Database". Then, click OK.
- Click the ConnectionString column of the new row, then enter the path and the name of the Excel (.xls) file in the "DataSource" field.
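A typical OleDb connection string for an .xls file looks like the following; the path is illustrative, and `HDR=Yes` treats the first row of the range as column headers:

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyProject\items.xls;Extended Properties="Excel 8.0;HDR=Yes";
```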