During the creation of an OLAP model, the data log file generates records at the following locations.
在OLAP模型创建阶段,日志文件会在以下位置生成记录。
Download the data log to a computer for long-term analysis and visualization.
将数据日志下载到计算机,以便进行长期分析和可视化。
It then looks at the movement of log data from the kernel into user space.
然后,本文介绍了日志数据从内核到用户空间的移动过程。
Change log data and replication change data are more secure.
更改日志数据和复制更改数据更加安全。
Shut the database down and back up all data and all log and control files.
关闭数据库,备份所有数据以及所有日志和控制文件。
The analysis of log data can be rudimentary, such as simple summaries of basic activity.
对日志数据的分析可能只是初步的,例如基本活动的简单总结。
The more information in the log, the more powerful the data mining you can achieve.
日志中的信息越多,您所能实现的数据挖掘就越强大。
The secondary keeps data for each log file in a separate staging file.
次要服务器将每个日志文件中的数据保存在单独的暂存文件中。
Log message data on error scenarios.
记录关于错误场景的消息数据。
Avoid RAID 5 for data, indexes, and log.
对于数据、索引和日志,应避免使用RAID 5。
Now I'll show you how to process large amounts of log data.
现在,讲解如何处理大量日志数据。
Listing 2: Sample code to collect data from the log file.
清单2:从日志文件中获取数据的范例代码。
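The actual Listing 2 is not included in this excerpt. As an illustration only, a minimal Python sketch of collecting data from a log file might look like the following; the line format and the field names (`date`, `time`, `level`, `message`) are assumptions, not part of the original listing:

```python
import re

# Assumed log line format: "2024-01-01 12:00:00 ERROR disk full"
LOG_PATTERN = re.compile(
    r"^(?P<date>\S+) (?P<time>\S+) (?P<level>\w+) (?P<message>.*)$"
)

def collect_log_data(lines):
    """Parse log lines and collect the structured fields of each entry."""
    entries = []
    for line in lines:
        match = LOG_PATTERN.match(line.strip())
        if match:  # skip lines that do not fit the assumed format
            entries.append(match.groupdict())
    return entries

# Example input standing in for the contents of a log file
sample = [
    "2024-01-01 12:00:00 ERROR disk full",
    "2024-01-01 12:00:05 INFO backup started",
]
data = collect_log_data(sample)
```

In practice the lines would come from iterating over an open file object rather than an in-memory list.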
HPEL also provides more flexible access to log and trace data.
HPEL还提供了更灵活的日志和跟踪数据访问。
You can derive the following data from the log entry.
您可以从该日志条目得出以下数据。
Add transient data definition for log files.
为日志文件添加临时数据定义。
The capture TAB shows the state of data capture and location in the logical log.
Capture选项卡显示数据捕获的状态和逻辑日志中的位置。
Based on this data, the log analytics package can create reports such as the following.
基于该数据,日志分析包可以创建报告,例如以下报告。
Writing a JDBC application to access audit log data.
编写JDBC应用程序来访问审核日志数据。
The size of the active log data sets.
活动日志数据集的大小。
How many active log data sets are defined?
定义了多少个活动日志数据集?
We need to maintain data information in the trace log for each device.
我们需要在跟踪日志中为每个设备维护数据信息。
Maintain shift log of production and other data and prepare production and other reports.
维护生产及其他数据的班次日志,并编制生产及其他报告。
The development of geostress profile has opened a new field for log data application.
地应力剖面的研制为测井资料开辟了新的应用领域。
Depth errors in log data are almost inevitable.
测井曲线间的深度误差几乎是不可避免的。
Only process log file data in 24 hour increments.
仅处理在24小时内累积的记录文件数据。
Objective: A log-linear model was applied to analyze the fracture data.
目的:应用对数线性模型对骨折资料进行分析。
It is a simple and feasible approach to calculate rock drillability from log data.
利用测井资料求取岩石的可钻性是一种简便可行的途径。
The log response of the Fenghuadian igneous-rock reservoir is characterized by comparing core data with log curves.
通过岩心资料与测井曲线的对比,搞清了风化店火山岩储集层的测井响应特征。
A reasonable procedure for computing the fractal dimension of log data is given.
结合实际提出了测井数据分维计算的合理流程。
The identification of oil and water zones is a key part of the secondary interpretation of log data.
油水层识别是测井二次解释的重要内容。