update: IC database online-migration process document, authored Aug 25, 2021 by 李子健
data_stream/ic/ic_online_lake.md
# Process Documentation
## Flow Diagram
```plantuml
@startuml
file ic_crawler_bson_10.8.6.227 as bson_aliyun
file ic_crawler_bson_10.8.6.84 as bson_office
queue kafka_topic_ic_spider_all
database ic_ar
database ic_base
database ic_biz
file update_data_json
queue table_update_redis
file update_data_json_province
bson_aliyun --> bson_office: shared disk /data_227/
bson_office --> kafka_topic_ic_spider_all: all data types; bson_reader, kafka_writer
kafka_topic_ic_spider_all --> ic_ar: ar data_types; kafka_reader, sync_mysql_filter, redis_writer
kafka_topic_ic_spider_all --> ic_base: base data_types; kafka_reader, sync_mysql_filter, redis_writer
kafka_topic_ic_spider_all --> ic_biz: biz data_types; kafka_reader, sync_mysql_filter, redis_writer
ic_ar --> table_update_redis: on insert, write primary keys of changed rows to redis
ic_base --> table_update_redis: on insert, write primary keys of changed rows to redis
ic_biz --> table_update_redis: on insert, write primary keys of changed rows to redis
table_update_redis --> update_data_json: redis_reader, udm_filter, file_writer
update_data_json --> update_data_json_province: split into directories by province
@enduml
```
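The fan-out from the single kafka topic into the three MySQL databases can be sketched as a routing step keyed on `data_type`. This is a minimal illustration only: the prefix convention (`ar`/`base`/`biz`), the example data_type names, and the function name are assumptions, not the actual pipeline code.

```python
# Hypothetical sketch of the fan-out step: each message consumed from
# kafka_topic_ic_spider_all is routed to one target MySQL database
# based on the prefix of its data_type. The prefix convention below
# is an assumption for illustration.

# Assumed mapping from data_type prefix to target database.
ROUTES = {
    "ar": "ic_ar",
    "base": "ic_base",
    "biz": "ic_biz",
}

def route_message(data_type: str) -> str:
    """Return the target database name for a message, by data_type prefix."""
    prefix = data_type.split("_", 1)[0]
    try:
        return ROUTES[prefix]
    except KeyError:
        raise ValueError(f"unknown data_type: {data_type}")
```

In the real deployment this routing is handled by the data_pump processes described below; the sketch only shows the shape of the decision.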
## Syncing Crawler Results to MySQL
```
1. Read all bson files (per data_type) and write them into a single kafka topic:
Deployment host: 10.8.6.84
data_pump config file:
/home/collie/product/app_online_lake/data_pump/new_online/all_spider_update_lake.yml
supervisor config (29 processes):
/home/collie/product/app_online_lake/supervisor/ic_spider_sync_lake.conf
2. Consume kafka, update mysql, and write to redis:
Deployment host: 10.8.6.84
data_pump config file:
/home/collie/product/app_online_lake/data_pump/new_online/collie_all_spider_update_lake_kafka.yml
supervisor config (72 processes):
/home/collie/product/app_online_lake/supervisor/ic_spider_update_lake_kafka.conf
```
| data_type | topic | group | num_procs |
| ---------- | ------------- | ---------------------- | --------- |
| all data types | ic_spider_all | ic_spider_all_to_mysql | 72 |
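A data_pump pipeline config of this kind generally pairs a reader, optional filters, and a writer. The sketch below is a hypothetical shape only, not the actual contents of collie_all_spider_update_lake_kafka.yml; every key and value in it is an assumption.

```yaml
# Hypothetical data_pump config sketch; all keys and values are assumptions.
reader:
  type: kafka_reader
  topic: ic_spider_all
  group: ic_spider_all_to_mysql
filters:
  - type: sync_mysql_filter   # upsert changed rows into the target MySQL database
writer:
  type: redis_writer          # record primary keys of changed rows
  key: table_update_redis
```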
## Fetching Incremental Data: Read Redis, Then Query MySQL
```
Deployment host: 10.8.6.84
data_pump config file:
/home/collie/product/app_online_lake/data_pump/new_online/collie_all_spider_update_lake_redis.yml
supervisor config (37 processes):
/home/collie/product/app_online_lake/supervisor/ic_spider_update_lake_redis.conf
redis: bdp-mq-001.redis.rds.aliyuncs.com
db: 1
```
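The last step in the flow diagram (update_data_json split into update_data_json_province) can be sketched as a small partitioning routine. The record shape (a `province` field and an `id` field) and the function name are assumptions for illustration, not the actual udm_filter/file_writer behavior.

```python
import json
import os

def write_by_province(records, out_dir):
    """Write update records into per-province subdirectories,
    mirroring the update_data_json -> update_data_json_province step.

    Assumes each record is a dict with an 'id' key and, optionally,
    a 'province' key; records without a province go to 'unknown'.
    Returns the list of file paths written.
    """
    paths = []
    for rec in records:
        province = rec.get("province", "unknown")
        subdir = os.path.join(out_dir, province)
        os.makedirs(subdir, exist_ok=True)
        path = os.path.join(subdir, f"{rec['id']}.json")
        with open(path, "w", encoding="utf-8") as f:
            json.dump(rec, f, ensure_ascii=False)
        paths.append(path)
    return paths
```

In production this partitioning is done by the data_pump file_writer; the sketch only shows the directory layout being produced.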