MariaDB Temporal Data Tables: What They Do and How to Use Them

ZIP file | 2KB | Updated 2025-01-30
Based on the file information provided, we can analyze the question posed in the title, "temporal_tables_test: what are the temporal data tables at https://mariadb.com/kb/en/temporal-data-tables/ for?", together with the "TSQL" tag and the name of the file inside the archive, "temporal_tables_test-master".

Key Point 1: Temporal Data Tables

In MariaDB, a temporal data table is a special kind of table that stores a complete history of how its data changes over time. This design is valuable in any scenario that needs to track change history, such as auditing, version control, and data recovery. A temporal table achieves this by recording data versions over time ranges: each version has a start time and an end time, and those timestamps can represent either database system time or an application-defined validity period.

Key Point 2: How MariaDB implements temporal data tables

MariaDB supports two complementary forms of temporality. System-versioned tables, declared with `WITH SYSTEM VERSIONING`, track when each row version existed from the database's point of view; MariaDB maintains two invisible generated columns, `ROW_START` and `ROW_END`, that bound the system-time interval of every row version. Application-time period tables instead use two ordinary, user-defined columns, grouped by a `PERIOD FOR` clause, to record when a fact is valid in the real world. A table may combine both forms, producing a bitemporal table. Together these mechanisms let MariaDB store and retrieve every version of the data in a table.

Key Point 3: Use cases for temporal tables

Temporal tables apply to many kinds of business scenarios, for example:

- Audit logs: record the change history of data for later auditing or monitoring.
- Data version control: let users see the state of the data at a specific point in time.
- Data recovery: restore data to a historical state from a point-in-time snapshot.
- Data analysis: analyze trends and patterns in how data changes over time.

Key Point 4: T-SQL support

T-SQL (Transact-SQL) is Microsoft SQL Server's extended SQL dialect; it adds features such as procedural constructs, error handling, and transaction control. Although MariaDB and Microsoft SQL Server are different database management systems, both implement SQL, and many concepts and features are directly comparable between the two. Note, however, that MariaDB's temporal-table feature is specific to MariaDB: its concrete syntax and implementation differ from Microsoft SQL Server's.

Key Point 5: Creating a temporal table

Creating a temporal table in MariaDB uses dedicated data definition language (DDL) syntax. For an application-time period table, you define the two period columns with a suitable temporal data type and bind them with a `PERIOD FOR` clause. A simple example:

```sql
CREATE TABLE example_temporal_table (
    id INT,
    data VARCHAR(100),
    start_time TIMESTAMP NOT NULL,
    end_time TIMESTAMP NOT NULL,
    PERIOD FOR time_period (start_time, end_time)
);
```

In this statement, `start_time` and `end_time` bound each row's validity, and the `PERIOD FOR` clause designates that pair of columns as the table's application-time period.

Key Point 6: The temporal_tables_test-master file

Judging by the name "temporal_tables_test-master", this is likely a project containing test examples for temporal tables. It probably holds SQL scripts demonstrating table creation, data insertion, querying, and version-control operations, and may be code a MariaDB developer used to exercise the temporal-table feature.

To summarize: temporal tables are a very useful database feature. By recording and managing how data changes over time, they provide strong support for versioning, auditing, and data recovery. MariaDB, one of the most popular open-source database systems, offers this feature; T-SQL, as SQL Server's dialect, differs in implementation detail but shares the same underlying concepts. By creating and managing temporal tables, developers and database administrators can maintain and exploit a complete history of their data and improve their data-governance capabilities.
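The other temporal form MariaDB offers, a system-versioned table, can be sketched as follows. This is a minimal illustration, not code from the archive: the table and column names are invented, and the syntax assumes MariaDB 10.3 or later.

```sql
-- A system-versioned table: MariaDB keeps superseded row versions
-- automatically. Table and column names here are illustrative.
CREATE TABLE employee (
    id INT PRIMARY KEY,
    name VARCHAR(100),
    salary INT
) WITH SYSTEM VERSIONING;

INSERT INTO employee VALUES (1, 'Alice', 50000);

-- The UPDATE closes the old row version and opens a new one.
UPDATE employee SET salary = 55000 WHERE id = 1;

-- A plain SELECT sees only current rows.
SELECT * FROM employee;

-- All versions, with their system-time bounds from the invisible columns.
SELECT id, salary, ROW_START, ROW_END
FROM employee FOR SYSTEM_TIME ALL;

-- The table's state as of an earlier point in time.
SELECT * FROM employee
FOR SYSTEM_TIME AS OF TIMESTAMP '2024-01-01 00:00:00';
```

By default the historical rows live in the same table as the current ones; for large, frequently updated tables MariaDB also allows partitioning history separately.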

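For comparison with the T-SQL side raised by the tag, SQL Server (2016 and later) expresses system-versioned temporal tables with an explicit period-column pair. The following is a sketch with illustrative names, not code from the archive:

```sql
-- SQL Server 2016+ system-versioned temporal table (illustrative names).
-- The two DATETIME2 period columns and the PERIOD FOR SYSTEM_TIME clause
-- are required; SQL Server creates a history table automatically when
-- none is named in the WITH clause.
CREATE TABLE dbo.example_temporal_table (
    id INT PRIMARY KEY,
    data VARCHAR(100),
    sys_start DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    sys_end   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (sys_start, sys_end)
) WITH (SYSTEM_VERSIONING = ON);

-- Point-in-time queries use the same SQL:2011-style syntax as MariaDB.
SELECT * FROM dbo.example_temporal_table
FOR SYSTEM_TIME AS OF '2024-01-01T00:00:00';
```

The two systems differ mainly in declaration style (MariaDB can hide the period columns entirely), while the `FOR SYSTEM_TIME` query surface is largely shared, which is why the concepts transfer between them.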
Uploaded by: AaronGary