Overview

The Dataset Module connects your solution to databases through TServer services, managing data flow, security, and performance. This reference covers essential practices for production deployments, including concurrency management, security hardening, and performance optimization.

Architecture & Execution Model

How the Dataset Module Works

The Dataset Module operates as an intermediary layer between your solution and databases:

  • TServer Integration: All database operations route through TServer services
  • Execution Modes: Supports both synchronous (blocking) and asynchronous (non-blocking) operations
  • Result Processing: Returns .NET DataTable objects for flexible data manipulation
  • Distribution: Propagates data to displays, reports, scripts, and other modules

Synchronous vs Asynchronous Operations

Synchronous Execution:

  • Blocks execution until database responds
  • Returns DataTable directly to calling method
  • Stores result in Local Contents property
  • Best for: Small queries, immediate data needs

Asynchronous Execution:

  • Non-blocking operation
  • Updates Async Contents property when complete
  • Enables row-by-row navigation
  • Best for: Large datasets, UI responsiveness
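
The following is a minimal sketch of both modes in script, assuming the SelectCommand() and SelectCommandWithStatusAsync() methods shown elsewhere on this page; confirm the exact method names and the Async Contents behavior against your platform version.

// Synchronous: blocks the script until the database responds, then
// returns the DataTable directly (the result is also stored in Local Contents).
DataTable syncResult = @Dataset.Query.Query1.SelectCommand();

// Asynchronous (sketch): the calling display or script is not blocked;
// the Async Contents property is updated when the query completes.
// Method name taken from the Error Detection examples later on this page.
string status;
DataTable asyncResult = @Dataset.Query.Query1.SelectCommandWithStatusAsync(out status);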

Concurrency & Server Domain Management

Understanding Server Domain Attributes

All Dataset Module properties exist in the server domain, creating a shared resource environment:

Example Risk Scenario:
1. Client A sets: SQLStatement = "SELECT * FROM Orders WHERE Status='Open'"
2. Client B sets: SQLStatement = "SELECT * FROM Orders WHERE Status='Closed'"
3. Execute command runs with Client B's statement (last write wins)

Preventing Concurrency Conflicts

Strategy 1: Dedicated Query Objects

  • Create separate query objects for different operations
  • Assign unique queries to specific tasks or displays
  • Avoid sharing query objects between concurrent processes

Strategy 2: Synchronization Patterns

  • Use semaphores or locks in scripts
  • Implement request queuing for shared resources
  • Design state machines for complex operations
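
As an illustration of Strategy 2, here is a hedged sketch that serializes access to a shared query object from script. It assumes standard .NET locking is available in the scripting environment and that every writer goes through this same code path; a script lock only serializes callers within one runtime process, so for conflicts between independent clients Strategy 1 (dedicated query objects) is the more robust choice.

// Shared lock object declared once at script level (assumption: all
// scripts that modify Query1 use this same code path).
static readonly object query1Lock = new object();

lock (query1Lock)
{
    // Set the statement and execute inside the lock so another task
    // cannot overwrite SQLStatement before the command runs.
    @Dataset.Query.Query1.SQLStatement = "SELECT * FROM Orders WHERE Status='Open'";
    DataTable result = @Dataset.Query.Query1.SelectCommand();
    // Copy what you need into local variables before releasing the lock.
}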

Strategy 3: Client-Side Processing

  • Execute queries in scripts with local variables
  • Process DataTables before assignment to tags
  • Minimize server domain property modifications

Data Management Strategies

DataTable Usage Patterns

The Dataset Module provides three primary patterns for DataTable management:

Pattern 1: Direct Script Processing

DataTable result = @Dataset.Query.Query1.SelectCommand();
// Process data locally without server domain impact
foreach(DataRow row in result.Rows) {
    // Local processing
}

Pattern 2: Tag Distribution

// Assign to DataTable tag for module sharing
@Tag.MyDataTable = @Dataset.Query.Query1.SelectCommand();
// Now available to displays, reports, etc.

Pattern 3: Mapped Navigation

// Configure mapping, then navigate rows
@Dataset.Query.Query1.Select();
@Dataset.Query.Query1.Next(); // Moves to next row
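
A hedged usage sketch that walks the mapped result row by row; it assumes RowCount (listed under Diagnostic Properties below) reflects the size of the selected result and that the cursor starts on the first row after Select().

// Run the mapped select, then step through the rows one at a time.
@Dataset.Query.Query1.Select();
int rows = @Dataset.Query.Query1.RowCount;   // assumption: populated after Select()
for (int i = 0; i < rows; i++)
{
    // The mapped tags hold the current row's values here; process them as needed.
    if (i < rows - 1)
        @Dataset.Query.Query1.Next();        // advance to the next row
}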

Memory & Traffic Optimization

Control Data Volume:

  • Always use WHERE clauses to filter results
  • Implement pagination for large datasets
  • Use SELECT specific columns instead of SELECT *
  • Consider indexed views for complex queries

Resource Planning:

  • Small query (< 1,000 rows): minimal impact
  • Medium query (1,000-10,000 rows): monitor memory usage
  • Large query (> 10,000 rows): implement pagination
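
A hedged sketch of simple pagination by adjusting the SQL statement from script, using SQL Server syntax (see the database-specific table below for other platforms). The page size, column names, and page-number source are illustrative; the concatenated values are integers computed in script, not user input, which must always go through parameterization as described under Security Implementation.

// Fetch one page at a time instead of the whole table (SQL Server syntax).
int pageSize = 1000;                 // illustrative page size
int pageNumber = 0;                  // illustrative; could come from a display tag
int offset = pageNumber * pageSize;

@Dataset.Query.Query1.SQLStatement =
    "SELECT OrderId, Status, CreatedAt FROM Orders " +
    "WHERE Status = 'Open' " +
    "ORDER BY OrderId " +
    "OFFSET " + offset + " ROWS FETCH NEXT " + pageSize + " ROWS ONLY";

DataTable page = @Dataset.Query.Query1.SelectCommand();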

Security Implementation

SQL Injection Prevention

Never Do This:

string query = "SELECT * FROM Users WHERE Name = '" + userInput + "'";

Always Do This:

execute GetUserData @userName={Tag.UserInput}, @userId={Tag.UserId}

The platform's parameterization:

  • Treats all parameters as values, not code
  • Prevents malicious SQL execution
  • Maintains data type integrity
  • Supports all major database platforms

Network Security Architecture

Gateway Configuration for Restricted Databases:

  1. Identify Restriction: Database accessible only from specific servers
  2. Install Gateway: Deploy platform with TWebServer on authorized machine
  3. Configure ServerIP: Point Dataset connections to gateway machine
  4. Result: Secure database access through controlled gateway

Benefits:

  • Maintains network security policies
  • Centralizes database connections
  • Enables audit logging
  • Supports DMZ architectures

Database-Specific Configuration

Platform Compatibility

Database     | Syntax Example                           | Special Considerations
SQL Server   | SELECT TOP 10 * FROM Table               | Use TOP for limiting
SQLite       | SELECT * FROM Table LIMIT 10             | Use LIMIT clause
MySQL        | SELECT * FROM `Table` LIMIT 10           | Backticks for names
PostgreSQL   | SELECT * FROM "Table" LIMIT 10           | Case-sensitive names
Oracle       | SELECT * FROM Table WHERE ROWNUM <= 10   | ROWNUM for limiting

Time Zone Management

Default Behavior:

  • Platform stores all DateTime values as UTC
  • Historian and Alarm data always in UTC
  • Automatic conversion for display

Configuring External Databases:

DateTimeMode Settings:
- UTC: No conversion needed
- LocalTime: Platform converts automatically
- Custom: Handle in SQL statements

Local Time Queries:

-- Adjust for local time zone (example: EST -5 hours)
WHERE Timestamp >= DATEADD(hour, -5, @Tag.StartTimeUTC)
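
Where the offset is not fixed (daylight saving time), a hedged alternative is to convert in script before building the query, using standard .NET time-zone conversion; the tag names here are illustrative.

// Convert a UTC tag value to the server's local time before using it in a
// query against a database that stores local timestamps.
DateTime startUtc = @Tag.StartTimeUTC;                    // illustrative tag
DateTime startLocal = TimeZoneInfo.ConvertTimeFromUtc(
    startUtc, TimeZoneInfo.Local);                        // handles DST automatically
@Tag.StartTimeLocal = startLocal;                         // hypothetical tag consumed by the query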

Performance Optimization

Query Optimization Checklist

  • Indexes: Ensure indexes on filtered and joined columns
  • Statistics: Update database statistics regularly
  • Query Plans: Review execution plans for bottlenecks
  • Connection Pooling: Enable for frequent operations
  • Batch Operations: Group multiple operations when possible

Performance Monitoring

Key Metrics to Track:

  • Query execution time
  • Memory consumption
  • Network latency
  • Connection pool usage
  • Cache hit rates

Diagnostic Properties:

@Dataset.Query.Query1.Error        // Last error message
@Dataset.Query.Query1.ExecutionTime // Query duration
@Dataset.Query.Query1.RowCount      // Result size
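
A hedged sketch that reads these diagnostic properties after a query and flags slow executions; the threshold, the assumed units of ExecutionTime, and the logging tag are illustrative.

DataTable result = @Dataset.Query.Query1.SelectCommand();

double elapsed = @Dataset.Query.Query1.ExecutionTime;   // query duration (type/units assumed)
int rows = @Dataset.Query.Query1.RowCount;              // result size

if (elapsed > 2.0)   // illustrative threshold
{
    // Record slow queries for later review; replace with your own logging mechanism.
    @Tag.SlowQueryLog = DateTime.UtcNow + " Query1 took " + elapsed + " for " + rows + " rows";
}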

Error Handling & Recovery

Error Detection

The module provides multiple error detection methods:

Method 1: Property Monitoring

@Dataset.Query.Query1.SelectCommand();
if (@Dataset.Query.Query1.Error != "") {
    // Handle error
}

Method 2: Status Methods

string status;
DataTable result = @Dataset.Query.Query1.SelectCommandWithStatusAsync(out status);
if (status != "OK") {
    // Handle error
}

Common Error Scenarios

Error Type          | Typical Cause                | Resolution
Connection Timeout  | Network issues, server load  | Increase timeout, check connectivity
Syntax Error        | Database-specific SQL        | Verify syntax for target database
Permission Denied   | Insufficient privileges      | Check database user permissions
Deadlock            | Concurrent transactions      | Implement retry logic
Out of Memory       | Large result set             | Add pagination, increase resources
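
For the deadlock case, a hedged retry sketch built on the Error property shown above; the retry count, back-off delay, and the string match on the error text are illustrative and should be adapted to the messages your database actually returns.

// Retry a command a few times when the database reports a deadlock.
int maxRetries = 3;                                   // illustrative
for (int attempt = 1; attempt <= maxRetries; attempt++)
{
    @Dataset.Query.Query1.SelectCommand();
    string error = @Dataset.Query.Query1.Error;

    if (error == "")
        break;                                        // success

    if (!error.Contains("deadlock") || attempt == maxRetries)
    {
        // Not a deadlock, or retries exhausted: surface the error.
        @Tag.LastQueryError = error;                  // hypothetical tag
        break;
    }

    System.Threading.Thread.Sleep(500 * attempt);     // simple back-off between attempts
}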

Backup & Recovery

SQLite Backup Strategies

Option 1: Command Line Backup

sqlite3 source.db ".backup backup.db"

  • Simple and reliable
  • Requires database file access
  • Best for scheduled maintenance

Option 2: Online Backup API

  • Backup while database is active
  • Support for incremental backups
  • Progress monitoring capability
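
A hedged sketch of Option 2 using the System.Data.SQLite BackupDatabase method from a script; it assumes the System.Data.SQLite assembly is referenced by the project and that the file paths are adapted to your deployment.

// Back up an active SQLite database without taking it offline.
using (var source = new System.Data.SQLite.SQLiteConnection("Data Source=C:\\Data\\app.db"))
using (var destination = new System.Data.SQLite.SQLiteConnection("Data Source=C:\\Backups\\app_backup.db"))
{
    source.Open();
    destination.Open();

    // Copy all pages of the "main" schema; pages: -1 = everything,
    // callback: null = no progress reporting, retryMilliseconds: 0.
    source.BackupDatabase(destination, "main", "main", -1, null, 0);
}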

Option 3: File System Copy

  • Only when database is offline
  • Fastest for large databases
  • Requires downtime

Backup Best Practices

  1. Schedule: Automate backups during low-activity periods
  2. Verify: Test restore procedures monthly
  3. Rotate: Maintain multiple backup generations
  4. Secure: Store backups in separate physical location
  5. Document: Maintain restore procedure documentation

Store & Forward Limitations

The Store & Forward feature has specific requirements:

  • Applies Only To: Historian and Alarm databases
  • Required Schema: Must include control columns
  • Not Available For: Generic application databases
  • Alternative: Implement custom buffering for generic databases

For Store & Forward configuration, see Historian Archiving Process documentation.
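
As the alternative bullet above notes, generic application databases need their own buffering. Below is a minimal hedged sketch of the idea, queuing failed writes in memory and retrying later; the query object name and ExecuteCommand method are hypothetical, and a production version would persist the queue to disk so it survives restarts.

// Minimal store-and-retry idea for a generic database write.
static System.Collections.Generic.Queue<string> pendingWrites =
    new System.Collections.Generic.Queue<string>();

void WriteOrBuffer(string insertStatement)
{
    @Dataset.Query.InsertQuery.SQLStatement = insertStatement;   // hypothetical query object
    @Dataset.Query.InsertQuery.ExecuteCommand();                 // hypothetical execute method
    if (@Dataset.Query.InsertQuery.Error != "")
        pendingWrites.Enqueue(insertStatement);                  // buffer for the retry task
}

void RetryPending()   // call periodically, e.g. from a scheduled server task
{
    while (pendingWrites.Count > 0)
    {
        string statement = pendingWrites.Peek();
        @Dataset.Query.InsertQuery.SQLStatement = statement;
        @Dataset.Query.InsertQuery.ExecuteCommand();
        if (@Dataset.Query.InsertQuery.Error != "")
            break;                                               // still failing; try again later
        pendingWrites.Dequeue();
    }
}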

Best Practices Summary

Design Principles

  1. Isolation: Use dedicated query objects for different operations
  2. Filtering: Always limit result sets with WHERE clauses
  3. Security: Use parameterized queries exclusively
  4. Monitoring: Track performance metrics and errors
  5. Planning: Design for concurrent access from the start

Production Checklist

Before deploying to production:

  • [ ] Parameterized all dynamic queries
  • [ ] Implemented error handling for all operations
  • [ ] Tested concurrent access scenarios
  • [ ] Configured appropriate timeouts
  • [ ] Established backup procedures
  • [ ] Documented recovery processes
  • [ ] Verified timezone handling
  • [ ] Optimized query performance
  • [ ] Planned for data growth
  • [ ] Secured network access

Related Documentation

  • Datasets Engine (Reference)
  • Datasets Runtime Attributes
  • Historian Archiving Process
  • TServer Configuration Guide