If you did not already know

Domain Knowledge-driven Methodology (DoKnowMe) Software engineering considers performance evaluation to be one of the key parts of software quality assurance. Unfortunately, there seems to be a lack of standard methodologies for performance evaluation, even within experimental computer science. Inspired by the concept of 'instantiation' in object-oriented programming, we distinguish the generic performance evaluation logic from the distributed and ad-hoc relevant studies, and develop an abstract evaluation methodology (by analogy with a 'class') that we name Domain Knowledge-driven Methodology (DoKnowMe). By replacing five predefined domain-specific knowledge artefacts, DoKnowMe can be instantiated into specific methodologies (by analogy with 'objects') to guide evaluators in the performance evaluation of different software and even computing systems. We also propose a generic validation framework with four indicators (i.e. usefulness, feasibility, effectiveness and repeatability), and use it to validate DoKnowMe in the Cloud services evaluation domain. Given the positive and promising validation result, we plan to integrate more common evaluation strategies to improve DoKnowMe and to focus further on the performance evaluation of Cloud autoscaler systems. …

Butterfly-Net Deep networks, especially Convolutional Neural Networks (CNNs), have been successfully applied in various areas of machine learning as well as to challenging problems in other scientific and engineering fields. This paper introduces Butterfly-Net, a low-complexity CNN with structured hard-coded weights and sparse across-channel connections, which aims at an optimal hierarchical function representation of the input signal. Theoretical analysis of the approximation power of Butterfly-Net to the Fourier representation of input data shows that the error decays exponentially as the depth increases. Because Butterfly-Net can approximate Fourier and local Fourier transforms, this result yields an approximation upper bound for CNNs on a large class of problems. The analysis is validated by numerical experiments on the approximation of a 1D Fourier kernel and on solving a 2D Poisson equation. …
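The hard-coded weights in Butterfly-Net mirror the butterfly factorization that underlies the fast Fourier transform, where each layer combines pairs of half-size transforms. As a rough illustration of that hierarchical structure (a minimal sketch only, not the paper's network; the function name is mine), here is a radix-2 Cooley-Tukey FFT whose recursive "butterfly" combining step corresponds to the cross-channel connections the network fixes in advance:

```python
import numpy as np

def butterfly_fft(x):
    """Radix-2 Cooley-Tukey FFT (length must be a power of two).

    Each level splits the signal into even/odd halves, transforms
    them recursively, and recombines with twiddle factors -- the
    'butterfly' pattern that Butterfly-Net hard-codes as weights.
    """
    n = len(x)
    if n == 1:
        return x.astype(complex)
    even = butterfly_fft(x[0::2])   # transform of even-indexed samples
    odd = butterfly_fft(x[1::2])    # transform of odd-indexed samples
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    # Butterfly combining step: sum and difference of the two halves.
    return np.concatenate([even + twiddle * odd,
                           even - twiddle * odd])

# Sanity check against NumPy's FFT on a random signal.
x = np.random.rand(64)
print(np.allclose(butterfly_fft(x), np.fft.fft(x)))
```

Each recursion level here plays the role of one network layer; the depth grows logarithmically with the input length, consistent with the exponential error decay in depth described above.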

eDiscovery Electronic discovery (or 'eDiscovery') is the process of identifying, preserving, collecting, analyzing, reviewing, and producing electronically stored information (ESI). Structured and unstructured data analysis is at the core of eDiscovery. Even routine matters regularly involve hundreds of gigabytes of data that must be analyzed for relevancy and privilege. Most often undertaken for litigation and regulatory compliance, eDiscovery processes and the underlying data mining technology are also deployed for internal investigations, due diligence, financial contract analysis, privacy impact assessments (including GDPR), and data breach responses. Undoubtedly, eDiscovery efforts are crucial to the ongoing success of today's modern corporation. For effective eDiscovery, enterprises need to be able to search through information across their entire organization, including both structured (e.g. databases) and unstructured data (e.g. emails, images), and effectively analyze content. The best eDiscovery software will integrate with existing systems and litigation-ready policies, enabling targeted data collection, sophisticated culling, and de-duplication. In addition, the capabilities of the best eDiscovery software include AI-enhanced analysis, full review and tagging, automated redactions, and DIY productions. …
