Error encountered when writing DAQmx data with scale information
kimbyungnam opened this issue
Are there any guidelines for writing data with scale information?
In the Writing TDMS files tutorial, the last code snippet raises an error when the original file contains scale information.
Example Code
from nptdms import TdmsFile, TdmsWriter, RootObject

original_file = TdmsFile("original_file.tdms")
original_groups = original_file.groups()
original_channels = [chan for group in original_groups for chan in group.channels()]

with TdmsWriter("copied_file.tdms") as copied_file:
    root_object = RootObject(original_file.properties)
    # include_channel is the tutorial's placeholder filter; the error also
    # occurs when simply copying every channel, as in the traceback below
    channels_to_copy = [chan for chan in original_channels if include_channel(chan)]
    copied_file.write_segment([root_object] + original_groups + channels_to_copy)
Error Log
---------------------------------------------------------------------------
error Traceback (most recent call last)
Cell In[10], line 10
8 root_object = RootObject(original_file.properties)
9 channels_to_copy = [chan for chan in original_channels]
---> 10 copied_file.write_segment([root_object] + original_groups + channels_to_copy)
File f:\2023\TEMP\park\venv\lib\site-packages\nptdms\writer.py:127, in TdmsWriter.write_segment(self, objects)
122 """ Write a segment of data to a TDMS file
123
124 :param objects: A list of TdmsObject instances to write
125 """
126 segment = TdmsSegment(objects, version=self._tdms_version)
--> 127 segment.write(self._file)
129 if self._index_file is not None:
130 segment = TdmsSegment(objects, is_index_file=True, version=self._tdms_version)
File f:\2023\TEMP\park\venv\lib\site-packages\nptdms\writer.py:161, in TdmsSegment.write(self, file)
160 def write(self, file):
--> 161 metadata = self.metadata()
162 metadata_size = sum(len(val.bytes) for val in metadata)
164 toc = ['kTocMetaData', 'kTocRawData', 'kTocNewObjList']
File f:\2023\TEMP\park\venv\lib\site-packages\nptdms\writer.py:177, in TdmsSegment.metadata(self)
175 for obj in self.objects:
176 metadata.append(String(obj.path))
--> 177 metadata.extend(self.raw_data_index(obj))
178 properties = read_properties_dict(obj.properties)
179 num_properties = len(properties)
File f:\2023\TEMP\park\venv\lib\site-packages\nptdms\writer.py:189, in TdmsSegment.raw_data_index(self, obj)
187 def raw_data_index(self, obj):
188 if hasattr(obj, 'data'):
--> 189 data_type = Int32(obj.data_type.enum_value)
190 dimension = Uint32(1)
191 num_values = Uint64(len(obj.data))
File f:\2023\TEMP\park\venv\lib\site-packages\nptdms\types.py:97, in StructType.__init__(self, value)
95 def __init__(self, value):
96 self.value = value
---> 97 self.bytes = _struct_pack('<' + self.struct_declaration, value)
error: argument out of range
The code below, however, works without any errors.
from nptdms import ChannelObject

channels_to_copy = [
    ChannelObject(group.name, chan.name, chan.data,
                  # copy the original properties but mark the data as already scaled
                  dict(chan.properties, **{"NI_Scaling_Status": "scaled"}))
    for group in original_groups
    for chan in group.channels()
]
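As a sanity check on a copy written this way (a sketch, assuming numeric channel data and the file names used above):

import numpy as np
from nptdms import TdmsFile

original = TdmsFile("original_file.tdms")
copied = TdmsFile("copied_file.tdms")

for group in original.groups():
    for chan in group.channels():
        copied_chan = copied[group.name][chan.name]
        # The values should match because the copy stores already-scaled data
        # and is marked "scaled", so no scaling is re-applied on read
        assert np.allclose(chan.data, copied_chan.data)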
I am attaching the channel properties of my TDMS file:
Property,Value (Dev1/ai2),Datatype
NI_ChannelLength,100,U64
NI_ChannelName,Dev1/ai0,String
NI_DataType,10,U16
NI_Number_Of_Scales,2,I32
NI_Scale[1]_Polynomial_Coefficients[0],0.000366,Double Float
NI_Scale[1]_Polynomial_Coefficients[1],0.00032,Double Float
NI_Scale[1]_Polynomial_Coefficients[2],0,Double Float
NI_Scale[1]_Polynomial_Coefficients[3],-0,Double Float
NI_Scale[1]_Polynomial_Coefficients_Size,4,I32
NI_Scale[1]_Polynomial_Input_Source,0,I32
NI_Scale[1]_Scale_Type,Polynomial,String
NI_Scaling_Status,scaled,String
NI_UnitDescription,Volts,String
unit_string,Volts,String
wf_increment,0.00018,Double Float
wf_samples,1,I32
wf_start_offset,0,Double Float
wf_start_time,2023-05-16 11:48:18.510502,DateTime
Hi @kimbyungnam, copying DAQmx channels like this isn't supported yet. It looks like what's happening is that the 0xFFFFFFFF value assigned as the data type for DAQmx data is overflowing the Int32 type used to write the raw data index information.
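The overflow can be reproduced with Python's struct module on its own; this is only an illustration, using the signed/unsigned 32-bit format characters that correspond to the writer's Int32/Uint32 types:

import struct

DAQMX_MARKER = 0xFFFFFFFF  # the data type value used for DAQmx raw data

# Packing as a signed 32-bit integer fails, matching the traceback above
try:
    struct.pack('<i', DAQMX_MARKER)
except struct.error as e:
    print(e)  # argument out of range

# Packing as an unsigned 32-bit integer succeeds
print(struct.pack('<I', DAQMX_MARKER))  # b'\xff\xff\xff\xff'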
Changing that to be a Uint32 instead should fix this error, but the result still wouldn't be correct: npTDMS will read the scaled data rather than the raw data, so when the file is read back the scaling will be applied twice. So you probably want to create a channel with the raw data instead; a rough sketch of that idea is below. This should probably also be changed to at least give a more helpful error message.
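Untested sketch, assuming each channel's unscaled values are available as a single array via raw_data (this won't hold for DAQmx channels that use multiple scalers, where raw_scaler_data comes into play):

from nptdms import TdmsFile, TdmsWriter, RootObject, GroupObject, ChannelObject

original_file = TdmsFile("original_file.tdms")

with TdmsWriter("copied_file.tdms") as copied_file:
    objects = [RootObject(original_file.properties)]
    for group in original_file.groups():
        objects.append(GroupObject(group.name, group.properties))
        for chan in group.channels():
            # Write the unscaled values and keep the original properties so the
            # scale information travels with the copy; whether scaling is applied
            # on read then depends on properties like NI_Scaling_Status
            objects.append(ChannelObject(group.name, chan.name, chan.raw_data, chan.properties))
    copied_file.write_segment(objects)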