DATERA

Datera was a global enterprise software company headquartered in Santa Clara, California, that developed an enterprise software-defined storage platform. It was co-founded in 2013 by contributors to the open-source LIO (SCSI target) storage subsystem: Marc Fleischmann, Nicholas Bellinger, and Claudio Fleiner. In 2016, Datera emerged from stealth and raised $40 million in funding from Khosla Ventures, Samsung Ventures, Andy Bechtolsheim, and Pradeep Sindhu. In 2017, Datera partnered with the open-source private cloud platform vScaler to deliver scalable private clouds for a range of workloads, from high-performance databases to archival storage. Datera was acquired by VMware in April 2021. Datera positioned its platform as storage for database acceleration and for customers offering database as a service. Datera was recognized by Network World as a hot storage company to watch, by CRN as a top software-defined data center provider, and by the Telecom Council as a Service Provider Innovation Award winner. The Datera data services platform is a set of automated data services, including data compression, snapshots, and replication, that manages data across nodes.

Websites: www.datera.io
Organization/Foundation Name: Datera
License (Open/Proprietary): Proprietary
Source path (if open source):
Brief description: Datera is a cloud data management platform designed for data centers. It enables users to archive data and automatically manage lifecycle flow policies, replicate and transfer data between sites and the cloud automatically based on access patterns and data policies, consolidate files generated across dispersed environments, streamline the dynamic allocation of resources, optimize the use of physical server infrastructure, mix and match generations of servers within a cluster, and mitigate the risk of an endemic media firmware bug. Features include a cloud-based analytics portal, a self-service portal, snapshot management, copy2cloud, and Lightweight Directory Access Protocol (LDAP) integration. Datera gave a presentation of the platform at Tech Field Day 18.

Project summary

Project details

Key features

- Datera Orchestration: data can be moved dynamically across all of the resources, at the storage level, without impact on the application.
- Enterprise Performance: delivering deduplication, compression, encryption, and other storage services can introduce a significant performance impact; Datera holds intellectual property that lets its customers use all of these storage services while keeping enterprise performance.
- Ready Choice: the ability to adopt new technology. Datera promises that newly adopted technologies benefit not only new workloads but legacy workloads as well.
- Data Center Awareness: Datera encourages its customers to deploy the storage distributed across racks, which provides better fault resiliency and brings the data closer to the application.
- Predictive Operations: by constantly collecting telemetry from the running workloads, Datera can monitor and predict workload behavior, making sure a customer can utilize the storage to the fullest.

Architecture

A data architecture describes how data is managed, from collection through to transformation, distribution, and consumption. It sets the blueprint for data and the way it flows through data storage systems. It is foundational to data processing operations and artificial intelligence (AI) applications.

A good data architecture ensures that data is manageable and useful, supporting data lifecycle management. More specifically, it can avoid redundant data storage, improve data quality through cleansing and deduplication, and enable new applications. Modern data architectures also provide mechanisms to integrate data across domains, such as between departments or geographies, breaking down data silos without the huge complexity that comes with storing everything in one place. Modern data architectures often leverage cloud platforms to manage and process data. While the cloud can be more costly, its compute scalability enables important data processing tasks to be completed rapidly, and its storage scalability helps cope with rising data volumes and ensures all relevant data is available to improve the quality of training AI applications.

- Reducing redundancy: there may be overlapping data fields across different sources, creating a risk of inconsistency, data inaccuracies, and missed opportunities for data integration. A good data architecture can standardize how data is stored and potentially reduce duplication, enabling better-quality and more holistic analyses.
- Improving data quality: well-designed data architectures can solve some of the challenges of poorly managed data lakes, also known as "data swamps". A data swamp lacks the data quality and data governance practices needed to provide insightful learnings. Data architectures can help enforce data governance and data security standards, providing the oversight needed for data pipelines to operate as intended. By improving data quality and governance, data architectures can ensure that data is stored in a way that makes it useful now and in the future.

Current usage

Datera software deploys on industry-standard servers from Dell EMC, Fujitsu, Hewlett Packard Enterprise, Intel, Lenovo, Supermicro, and QUANTA to store blocks and objects in on-premises data centers and in private and hybrid cloud environments.

Dell EMC

Dell EMC (EMC Corporation until 2016) is an American multinational corporation headquartered in Hopkinton, Massachusetts and Round Rock, Texas, United States. [2] Dell EMC sells data storage, information security, virtualization, analytics, cloud computing, and other products and services that enable organizations to store, manage, protect, and analyze data. Dell EMC's target markets include large companies and small- and medium-sized businesses across various vertical markets. [3][4] The company's stock (as EMC Corporation) was added to the New York Stock Exchange on April 6, 1986, [5] and was also listed on the S&P 500 index. EMC was acquired by Dell in 2016; at that time, Forbes noted EMC's "focus on developing and selling data storage and data management hardware and software and convincing its customers to buy its products independent of their other IT buying decisions" based on "best-of-breed." [6] It was later renamed Dell EMC, and Dell uses the EMC name with some of its products. [7]

Fujitsu

Fujitsu Limited is a Japanese multinational information and communications technology equipment and services corporation, established in 1935 and headquartered in Tokyo. [3] Fujitsu was the world's sixth-largest IT services provider by annual revenue, and the largest in Japan, in 2021. [4] Fujitsu's hardware offerings are mainly personal and enterprise computing products, including x86, SPARC, and mainframe-compatible server products, although the corporation and its subsidiaries also offer a diversity of products and services in the areas of data storage, telecommunications, advanced microelectronics, and air conditioning. It has approximately 126,400 employees, and its products and services are available in approximately 180 countries. [2] Fujitsu is listed on the Tokyo Stock Exchange and Nagoya Stock Exchange; its Tokyo listing is a constituent of the Nikkei 225 and TOPIX 100 indices.

Hewlett Packard Enterprise

The Hewlett Packard Enterprise Company (HPE) is an American multinational information technology company based in Spring, Texas, United States. HPE was founded on November 1, 2015, in Palo Alto, California, as part of the splitting of the Hewlett-Packard Company. [2] It is a business-focused organization that works in servers, storage, networking, containerization software, and consulting and support. The split was structured so that the former Hewlett-Packard Company would change its name to HP Inc. and spin off Hewlett Packard Enterprise as a newly created company. HP Inc. retained the old HP's personal computer and printing business, as well as its stock-price history and original NYSE ticker symbol for Hewlett-Packard; Enterprise trades under its own ticker symbol, HPE. At the time of the spin-off, HPE's revenue was slightly less than that of HP Inc. [3] In 2017, HPE spun off its Enterprise Services business and merged it with Computer Sciences Corporation to become DXC Technology. Also in 2017, it spun off its software business segment and merged it with Micro Focus. [4]

Intel

Intel Corporation is an American multinational corporation and technology company headquartered in Santa Clara, California. It is the world's largest semiconductor chip manufacturer by revenue, and is one of the developers of the x86 series of instruction sets, the instruction sets found in most personal computers (PCs). Incorporated in Delaware, [5] Intel ranked No. 45 in the 2020 Fortune 500 list of the largest United States corporations by total revenue for nearly a decade, from the 2007 to 2016 fiscal years. [6] Intel supplies microprocessors for computer system manufacturers such as Acer, Lenovo, HP, and Dell. Intel also manufactures motherboard chipsets, network interface controllers and integrated circuits, flash memory, graphics chips, embedded processors, and other devices related to communications and computing. Intel (integrated and electronics) was founded on July 18, 1968, by semiconductor pioneers Gordon Moore (of Moore's law) and Robert Noyce (1927–1990), and is associated with the executive leadership and vision of Andrew Grove. Intel was a key component of the rise of Silicon Valley as a high-tech center. Noyce was a key inventor of the integrated circuit (microchip). Intel was an early developer of SRAM and DRAM memory chips, which represented the majority of its business until 1981. Although Intel created the world's first commercial microprocessor chip in 1971, it was not until the success of the personal computer (PC) that this became its primary business.

Technical details

Datera continuously monitors how the cluster is performing relative to the specified application intent, i.e., it compares admin_state with operation_state. Application requirements, in the form of policies, are specified by the application admin, and the control plane works to apply them constantly to a completely programmable data plane based on the availability of physical resources. A policy change to improve the performance of a subset of data causes that data to migrate autonomously, and with complete transparency, to nodes whose media types better fit the policy. Software on the individual nodes, built from commodity infrastructure, utilizes resource-specific capabilities depending on the type of storage, CPU, memory, and networking. Transformation covers protection, compression, encryption, duplication…
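The control loop described above can be read as a reconciliation pattern: compare the declared intent (admin_state) with the observed state (operation_state) and schedule corrective data movement whenever they diverge. The following Python sketch is purely illustrative; the VolumeIntent and VolumeStatus classes, their field names, and the action strings are assumptions made for this example and are not part of any Datera API.

```python
# Illustrative sketch of an intent-reconciliation loop (not Datera's actual code).
# VolumeIntent, VolumeStatus and the action strings are hypothetical names used
# only to show the admin_state vs. operation_state comparison.
from dataclasses import dataclass
import time

@dataclass
class VolumeIntent:          # admin_state: what the policy asks for
    volume_id: str
    media_type: str          # e.g. "nvme", "ssd", "hdd"
    replicas: int

@dataclass
class VolumeStatus:          # operation_state: what the cluster observes
    volume_id: str
    media_type: str
    replicas: int

def reconcile(intents: dict[str, VolumeIntent],
              statuses: dict[str, VolumeStatus]) -> list[str]:
    """Return the corrective actions needed to bring the observed
    state back in line with the declared intent."""
    actions = []
    for vol_id, intent in intents.items():
        status = statuses.get(vol_id)
        if status is None:
            actions.append(f"provision {vol_id} on {intent.media_type}")
        elif status.media_type != intent.media_type:
            # A policy change (e.g. hdd -> nvme) triggers transparent migration.
            actions.append(f"migrate {vol_id} to {intent.media_type}")
        elif status.replicas < intent.replicas:
            actions.append(f"add {intent.replicas - status.replicas} replica(s) to {vol_id}")
    return actions

def control_loop(get_intents, get_statuses, apply_action, interval_s=30):
    """Continuously compare admin_state with operation_state and act on drift."""
    while True:
        for action in reconcile(get_intents(), get_statuses()):
            apply_action(action)
        time.sleep(interval_s)
```

A policy edit only changes the declared intent; the loop then converges the observed state toward it, which is why the migration appears transparent to the application.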

Additional information

Datera Embraces Change and Storage Autonomy

Datera was designed with a single mantra in mind: “The only constant is change.” Software on the individual nodes, built from commodity infrastructure, utilizes resource-specific capabilities, depending on the type of storage, CPU, memory, and networking, to drive that optimization.

The autonomous characteristics of Datera storage systems include:

- Recovery: a Datera system will autonomously recover and adjust data to meet the policy intent during failure and restoration of a variety of physical and software components.
- Policy Changes: policies can be changed on the fly, and the system will autonomously adjust data placement in an entirely transparent and non-disruptive manner to configure the data plane to meet the policy intent.
- Autonomous Redistribution: Datera allows application intent to be created via an AppInstance, even if the required capabilities are not currently available on the cluster. When resources such as new storage media or memory are added, the data is redistributed non-disruptively to meet the intent as part of a closed-loop autonomous optimization (see the sketch below). Datera lets admins declare the end goal, and the system strives to meet it when resources become available.
- Data Placement: Datera provides outcome-based data placement mapping driven by application intent.
- Rolling Upgrades: when a new software version is available, the cluster autonomously rolls out the updates.
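To make the “declare the end goal first” idea concrete, the sketch below shows what submitting an application intent to a management endpoint might look like. The URL, payload fields, and placement_policy value are hypothetical, chosen only to illustrate an AppInstance-style intent; they are not the documented Datera REST schema.

```python
# Hypothetical illustration of declaring application intent up front.
# The endpoint URL, payload fields and policy names are assumptions for
# this example, not the documented Datera API.
import requests

app_instance_intent = {
    "name": "orders-db",
    "placement_policy": "all-flash",  # may not be satisfiable by the cluster yet
    "storage_instances": [
        {
            "name": "data",
            "volumes": [
                {"name": "vol-0", "size_gb": 512, "replica_count": 3},
            ],
        }
    ],
}

# The intent is accepted even if no matching media exists yet; once suitable
# resources are added, the platform can redistribute data to satisfy the policy.
resp = requests.post(
    "https://mgmt.example.local/api/app_instances",
    json=app_instance_intent,
    timeout=10,
)
resp.raise_for_status()
print("intent recorded:", resp.json())
```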

Architecture diagram: Datera and the Rise of Enterprise Software-Defined Storage | PenguinPunk.net