Title: Research and Design of a Data Push Model and Consistency Protocol
Technical Area: Resource Management
With industry trends in the Internet, cloud computing, big data, scheduling, and containerization, the number of machines and containers in data centers keeps growing, and new technologies such as Service Mesh have emerged in distributed computing. Across scenarios such as service discovery, service health monitoring, service metadata delivery, configuration change push and distribution, agent control and management, dynamic domain name resolution, and multi-site active-active deployment, large-scale data push and distribution within the data center is the common foundation.
At the extreme container scales expected in the future, the underlying consistency protocols and data distribution mechanisms face key technical challenges in capacity, performance, and reliability. Traditional data distribution mechanisms cannot guarantee quality of service, so there is an urgent need to study, for extreme-scale data distribution, new consistency protocols, data distribution models and mechanisms, and key technologies for service-quality assurance.
We have found that push-model data distribution strategies still leave ample room for innovation and optimization at extreme scale, for example machine learning over the network topology, and timely feedback and tuning based on observable push efficiency and quality. Our goals are therefore to:
- Develop a new data distribution and consistency protocol for large-scale push models.
- Create a new efficient data distribution and consistency guarantee model: construct an environment model of large-scale data centers with environmental monitoring (statistical analysis, failure rates, etc.); build the data distribution model on top of that environment model; and establish a (weak) consistency model theory for extreme-scale data distribution, together with mechanisms and methods for reaching consistency quickly.
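To make the (weak) consistency goal above concrete, the following is a minimal sketch, under our own assumptions, of a versioned push model with eventual consistency: the distributor pushes monotonically increasing versions, lost pushes are repaired by an anti-entropy pass, and convergence is declared once every node holds the target version. All names here (`Distributor`, `Node`, the 30% loss rate) are illustrative, not from the proposal.

```python
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.version = 0          # last version applied locally

    def receive(self, version):
        # Stale or duplicate pushes are ignored: versions only move forward.
        if version > self.version:
            self.version = version

class Distributor:
    def __init__(self, nodes):
        self.nodes = nodes
        self.target = 0           # latest published version

    def publish(self, version):
        self.target = version
        for node in self.nodes:
            # A push may be lost in transit; the repair loop below models
            # the "reach consistency quickly" mechanism.
            if random.random() > 0.3:      # 30% simulated loss
                node.receive(version)

    def repair(self):
        # Anti-entropy pass: re-push to any node still behind the target.
        for node in self.nodes:
            if node.version < self.target:
                node.receive(self.target)

    def converged(self):
        return all(n.version == self.target for n in self.nodes)

nodes = [Node(f"n{i}") for i in range(1000)]
d = Distributor(nodes)
d.publish(1)
rounds = 0
while not d.converged():
    d.repair()
    rounds += 1
print("converged after", rounds, "repair round(s)")
```

In a real system the repair pass itself is lossy and is driven by node-reported versions (the observability feedback mentioned above), but the convergence structure is the same.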
Related Research Topics
1. Intelligent Data Distribution Strategy Generation and Verification Mechanisms
Targeting specific SLA requirements and data distribution models, intelligently generate data distribution strategies, and perform dynamic quantitative analysis and validation of them.
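One hedged sketch of such strategy generation and quantitative validation, under a deliberately simple latency model of our own (per-hop latency times push-tree depth, function names ours): choose the smallest fan-out for a push tree over the node set that meets the SLA, then re-validate the choice against the model.

```python
import math

def tree_depth(n, fanout):
    # Depth of a complete fanout-ary push tree covering n nodes.
    return math.ceil(math.log(max(n, 2), fanout))

def generate_strategy(n_nodes, hop_ms, sla_ms, max_fanout=64):
    # Smallest fan-out keeps per-parent load low while meeting the SLA.
    for fanout in range(2, max_fanout + 1):
        depth = tree_depth(n_nodes, fanout)
        est_ms = depth * hop_ms           # simple end-to-end latency model
        if est_ms <= sla_ms:
            return {"fanout": fanout, "depth": depth, "est_ms": est_ms}
    return None  # SLA unreachable under this model

def validate(strategy, n_nodes, hop_ms, sla_ms):
    # Dynamic quantitative check: recompute the estimate against the SLA.
    return (strategy is not None
            and tree_depth(n_nodes, strategy["fanout"]) * hop_ms <= sla_ms)

s = generate_strategy(n_nodes=100_000, hop_ms=20, sla_ms=120)
print(s, validate(s, 100_000, 20, 120))
```

A production version would replace the fixed per-hop latency with the environment model described earlier (measured latency distributions, failure rates) and re-run validation as those measurements change.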
2. Reliable Data Distribution Mechanism
Establish an ACK mechanism for the data distribution and propagation process, together with anomaly detection and isolation of nodes that violate the SLA, ensuring that data reliably reaches all normal nodes.
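The ACK-and-isolate loop above might look like the following sketch; all names (`PushTracker`, the retry limit) are illustrative assumptions rather than the proposal's design. A push round completes once every node has either acknowledged within the SLA or been isolated as anomalous.

```python
class PushTracker:
    def __init__(self, nodes, sla_ms, max_retries=3):
        self.sla_ms = sla_ms
        self.max_retries = max_retries
        self.pending = {n: 0 for n in nodes}   # node -> retry count
        self.acked = set()
        self.isolated = set()

    def on_ack(self, node, latency_ms):
        if node in self.pending:
            if latency_ms <= self.sla_ms:
                self.acked.add(node)
                del self.pending[node]
            else:
                self.on_timeout(node)   # SLA violation counts as a failure

    def on_timeout(self, node):
        self.pending[node] = self.pending.get(node, 0) + 1
        if self.pending[node] > self.max_retries:
            # Anomalous node: isolate it so it no longer blocks the round.
            self.isolated.add(node)
            del self.pending[node]

    def round_complete(self):
        # Data has reliably reached every non-isolated (normal) node.
        return not self.pending

t = PushTracker(["a", "b", "c"], sla_ms=100)
t.on_ack("a", 40)
t.on_ack("b", 250)          # violates the SLA -> counted as a retry
t.on_ack("b", 60)           # succeeds on retry
for _ in range(4):
    t.on_timeout("c")       # never acks -> eventually isolated
print(t.acked, t.isolated, t.round_complete())
```

Isolated nodes would be handed to a separate repair path (re-sync on recovery) rather than dropped, so isolation protects the push round's SLA without sacrificing eventual delivery.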