risk_zhixing · Changes

update: 数据清洗 (data cleaning) authored Feb 08, 2022 by Liu Zhiqiang

Showing with 16 additions and 7 deletions (+16 -7)

data_stream/risk/risk_zhixing.md (view page @ dbbf2a82)
@@ -71,7 +71,8 @@ db_password:
```buildoutcfg
* Persons subject to enforcement (被执行人)
* Generate 100,000 update tasks per day based on the largest zhixing_id in the table
* Publish tasks four times a day; each batch contains the zhixing_ids from the last four days whose crawl failure count is below 30 and whose crawl result is 1101 or null
* Before each publish, compare the largest zhixing_id whose crawl result is 1000 with the largest zhixing_id in the table; if the two ids differ by less than 70,000, backfill the missing zhixing_ids into the loss table
* Concluded enforcement cases (终本案件)
* Generate 100,000 tasks per day based on the largest zhongben_id in the table
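The backfill rule above can be sketched in Python. This is a minimal illustration, not the project's actual code: the function name, the assumption that zhixing_ids are dense consecutive integers, and the in-memory return value are all hypothetical.

```python
def backfill_loss_ids(max_id_result_1000: int, max_id_in_table: int) -> list[int]:
    """Return the zhixing_ids that should be added to the loss table.

    Rule from the doc: if the largest zhixing_id whose crawl result is 1000
    trails the table's largest zhixing_id by fewer than 70,000, the ids in
    between are backfilled into the loss table.
    """
    gap = max_id_in_table - max_id_result_1000
    if 0 < gap < 70_000:
        # Assumes ids are dense consecutive integers (illustration-only assumption).
        return list(range(max_id_result_1000 + 1, max_id_in_table + 1))
    return []  # gap too large, or nothing missing: no backfill
```

For example, `backfill_loss_ids(99_990, 100_000)` yields the ten missing ids, while a gap of 100,000 yields an empty list because it exceeds the 70,000 threshold.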
@@ -667,9 +668,14 @@ index => "public-company-spider-data-%{log_date}"
# **Data Cleaning (数据清洗)**
## Owner
```
Liu Zhiqiang (刘治强)
```
## Code Location
```angular2html
http://192.168.109.110/granite/project-collie-app/-/tree/master/app_risk/udms/risk_zhixing_to_redis
http://192.168.109.110/granite/project-collie-app/-/tree/master/app_risk/udms/risk_zxgk_loss
```
## Deployment Location
<!--Machines and production code locations-->
@@ -678,15 +684,18 @@ index => "public-company-spider-data-%{log_date}"
<!--How to run: run commands, supervisor config, supervisor program, etc.-->
- [ ] crontab + data_pump
- [X] crontab + data_pump
- [ ] supervisor + data_pump
- [ ] supervisor + consumer
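The checked option above (crontab + data_pump) could be wired up roughly as follows. The script path, entrypoint, and log location are assumptions for illustration, not taken from the actual deployment; only the four-times-a-day cadence comes from the task description above.

```
# Hypothetical crontab entry: publish tasks four times a day (00:00, 06:00, 12:00, 18:00).
# Path and entrypoint are assumed, not from the source.
0 0,6,12,18 * * * python /opt/app_risk/udms/risk_zhixing_to_redis/main.py >> /var/log/risk_zhixing.log 2>&1
```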
## Data Input Sources
```angular2html
1. Spider data is used to update existing records
2. The code generates new zhixing ids and publishes new spider tasks
```
<!--From Kafka or from aggregated files? Which topic group?-->
## Data Storage Table
* Database host:
* Table name:
\ No newline at end of file
* Database host: bdp-rds-001.mysql.rds.aliyuncs.com
* Table name: risk_zhixing_loss
\ No newline at end of file