Forums

    • Forum
    • Topics
    • Posts
    • Freshness
    • Audio Processing/Analysis
      Audio analysis refers to the extraction of information and meaning from audio signals for analysis, classification, storage, retrieval, synthesis, etc. The observation media and interpretation methods vary: audio analysis can refer to the human ear and how people interpret an audible sound source, or to using technology such as an audio analyzer to evaluate other qualities of a sound source, such as amplitude, distortion, frequency response, and more. Once an audio source's information has been observed, it can be processed into a logical, emotional, descriptive, or otherwise relevant interpretation by the user. Discussions: Natural Analysis, Signal Analysis, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
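
      A minimal sketch of signal-level audio analysis in Python, assuming only NumPy; the 440 Hz tone and 16 kHz sample rate are made-up illustrative values, not taken from any forum post.

      import numpy as np

      # Synthesize one second of a 440 Hz tone at a 16 kHz sample rate (illustrative values).
      sr = 16000
      t = np.arange(sr) / sr
      signal = 0.5 * np.sin(2 * np.pi * 440 * t)

      # Frequency-domain view: magnitude spectrum via the real FFT.
      spectrum = np.abs(np.fft.rfft(signal))
      freqs = np.fft.rfftfreq(signal.size, d=1 / sr)

      print("Dominant frequency (Hz):", freqs[spectrum.argmax()])   # ~440 Hz
      print("Peak amplitude:", abs(signal).max())                   # ~0.5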

    • Bayesian Statistics
      Bayesian statistics is a system for describing epistemological uncertainty using the mathematical language of probability. In the 'Bayesian paradigm,' degrees of belief in states of nature are specified; these are non-negative, and the total belief in all states of nature is fixed to be one. A short worked example follows this entry.
    • 0
    • 0
    • No Topics
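
      A minimal worked example of the normalization described above, in Python with NumPy; the prior and likelihood numbers are illustrative assumptions.

      import numpy as np

      # Three hypothetical states of nature with a prior belief over them.
      prior = np.array([0.5, 0.3, 0.2])          # non-negative, total belief = 1
      likelihood = np.array([0.10, 0.40, 0.80])  # P(data | state), illustrative values

      # Bayes' rule: the posterior is proportional to prior * likelihood,
      # renormalized so that the total belief is again fixed to one.
      posterior = prior * likelihood
      posterior = posterior / posterior.sum()

      print(posterior, posterior.sum())  # the posterior beliefs sum to 1.0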

    • Big Data Analytics
      "Big data" is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software.  
    • 2
    • 4
    • 2 months, 3 weeks ago

       Taufik Sutanto

    • Bioinformatics
      As an interdisciplinary field of science, bioinformatics combines biology, computer science, information engineering, mathematics and statistics to analyze and interpret biological data. Bioinformatics has been used for in silico analyses of biological queries using mathematical and statistical techniques.  

    • 0
    • 0
    • No Topics

    • Computer Vision
      Computer vision is a field of artificial intelligence that trains computers to interpret and understand the visual world. Using digital images from cameras and videos and deep learning models, machines can accurately identify and classify objects, and then react to what they “see.” Discussions: image processing, segmentation, opencv, pattern recognition, filtering, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
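
      A minimal sketch of the kind of pipeline discussed here, assuming OpenCV 4 (opencv-python) and NumPy; the synthetic image is a stand-in for a real photo.

      import cv2
      import numpy as np

      # Build a small synthetic grayscale image: a bright square on a dark background.
      img = np.zeros((128, 128), dtype=np.uint8)
      cv2.rectangle(img, (32, 32), (96, 96), 255, thickness=-1)

      # Classic computer-vision steps: edge detection, then contour extraction.
      edges = cv2.Canny(img, 50, 150)
      contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
      print("Objects found:", len(contours))  # typically 1: the square's boundary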

    • Data Science (Machine Learning)
      Machine learning is a method of data analysis that automates analytical model building. It is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention. Discussions: (General) Linear Models ~ Regression, Classification Models, Clustering, Semi-Supervised Clustering, Soft Clustering/Classification, Recommendation Models, Spatial Data Analysis, Time Series Analysis, ... A short sketch follows this entry.
    • 10
    • 15
    • 2 months ago

       Taufik Sutanto
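
      A minimal classification sketch with scikit-learn, illustrating "learning from data with minimal human intervention"; the iris dataset and logistic regression are convenient illustrative choices, not a recommendation from this forum.

      from sklearn.datasets import load_iris
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      # Learn a classification model from labelled examples and evaluate on held-out data.
      X, y = load_iris(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

      model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))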

    • Data Wrangling / Pre-Processing
      Data wrangling, sometimes referred to as data munging, is the process of transforming and mapping data from one "raw" data form into another format with the intent of making it more appropriate and valuable for a variety of downstream purposes such as analytics. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
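
      A minimal pandas sketch of turning a "raw" table into something analysis-ready; the tiny DataFrame and its column names are made up for illustration.

      import pandas as pd

      # A tiny "raw" table with common problems: stray whitespace, missing values, bad types.
      raw = pd.DataFrame({
          "city": [" Jakarta", "Bandung ", None, "Surabaya"],
          "temp_c": ["31", "n/a", "29", "30"],
      })

      clean = (
          raw.assign(
              city=raw["city"].str.strip(),                         # trim whitespace
              temp_c=pd.to_numeric(raw["temp_c"], errors="coerce"), # coerce bad values to NaN
          )
          .dropna()               # drop incomplete rows
          .reset_index(drop=True)
      )
      print(clean)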

    • Database
      A database is an organized collection of data, generally stored and accessed electronically from a computer system. Where databases are more complex, they are often developed using formal design and modeling techniques. Discussions: SQL, NoSQL, & NewSQL, partitioning/sharding, replicas, query/setting optimizations, etc. A short sketch follows this entry.
    • 1
    • 1
    • 9 months, 3 weeks ago

       Taufik Sutanto
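
      A minimal SQL sketch using Python's built-in sqlite3 module; the table and rows are invented for illustration only.

      import sqlite3

      # An in-memory SQLite database: create a table, insert rows, run a query.
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, topic TEXT, replies INTEGER)")
      conn.executemany(
          "INSERT INTO posts (topic, replies) VALUES (?, ?)",
          [("indexing", 3), ("sharding", 5), ("replication", 2)],
      )

      for topic, replies in conn.execute(
          "SELECT topic, replies FROM posts WHERE replies > 2 ORDER BY replies DESC"
      ):
          print(topic, replies)
      conn.close()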

    • Deep Learning
      Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on artificial neural networks. Learning can be supervised, semi-supervised, or unsupervised. Deep Learning Discussions: Reinforcement Learning, GAN, CNN, LSTM, Embedding, Transfer Learning, Regularization, Dropout, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
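
      A minimal sketch of an artificial neural network with dropout, assuming PyTorch is installed; the layer sizes and batch are arbitrary illustrative choices.

      import torch
      from torch import nn

      # A small feed-forward network; Dropout is one of the regularization tricks listed above.
      model = nn.Sequential(
          nn.Linear(20, 64),
          nn.ReLU(),
          nn.Dropout(p=0.5),   # randomly zeroes activations during training to reduce overfitting
          nn.Linear(64, 2),
      )

      x = torch.randn(8, 20)   # a batch of 8 examples with 20 features each
      logits = model(x)
      print(logits.shape)      # torch.Size([8, 2])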

    • Graph Theory/Network Analysis
      In mathematics, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of vertices (also called nodes or points) which are connected by edges (also called links or lines). Discussions: Social Media Analytics, Graph Theory, Tree, Graph Partitioning, Graph Communities, Centrality, Shortest Path, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
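
      A minimal sketch with NetworkX of vertices, edges, shortest paths, and centrality; the toy graph is invented for illustration.

      import networkx as nx

      # A tiny undirected graph: vertices (nodes) connected by edges (links).
      G = nx.Graph()
      G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D")])

      print(nx.shortest_path(G, "A", "C"))   # one shortest path from A to C
      print(nx.degree_centrality(G))         # how connected each vertex is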

    • Internet of Things
      The internet of things, or IoT, is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. Discussions: Edge Computing, SBC, Sensors, smart city, home automation, AgriTech, etc.
    • 0
    • 0
    • No Topics

    • Mathematics
      A list of discussions of the mathematical background needed to get up and running in practical data-science and research work. Discussions: differential equations, Optimizations, Integral, Logics, Metric (topology), Linear Algebra, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
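
      A minimal linear-algebra sketch with NumPy, covering one of the background topics listed above; the 2x2 system is a made-up example.

      import numpy as np

      # Solve the linear system A x = b, a building block of many data-science methods.
      A = np.array([[3.0, 1.0],
                    [1.0, 2.0]])
      b = np.array([9.0, 8.0])

      x = np.linalg.solve(A, b)
      print(x)                      # [2. 3.]
      print(np.allclose(A @ x, b))  # True: the solution satisfies the system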

    • Metode Numerik
      This forum is a place to discuss the Metode Numerik (Numerical Methods) course at tau-data. Please enroll in that course first (it is free) before joining the discussion in this forum.
      Associated Courses:
      Metode Numerik
    • 0
    • 0
    • No Topics

    • Others
      Training, sales, and other matters not directly related to the existing forum topics.
    • 0
    • 0
    • No Topics

    • Programming
      Programming is the process of creating a set of instructions that tell a computer how to perform a task. Programming can be done using a variety of computer "languages". Discussions: Algorithms, Python, R, Java, C++, PHP, D3.js, etc.
    • 0
    • 0
    • No Topics

    • Qualitative Analysis
      Qualitative analysis uses subjective judgment based on unquantifiable information. Qualitative analysis deals with intangible and inexact information that can be difficult to collect and measure. Machines struggle to conduct qualitative analysis as intangibles can't be defined by numeric values. Discussions: Coding, Sampling, Interview, FGD, Content Analysis, Narrative Analysis, Discourse Analysis, Framework Analysis, Grounded Theory, etc.  
    • 0
    • 0
    • No Topics

    • Statistics
      Statistical methods are a key part of data science, yet few data scientists have an adequate understanding of core statistical concepts. This can lead to incorrect interpretations and decisions, and to other problems in real-life situations. Discussions: p-value, correlations, metric/distance, descriptive statistics, hypothesis testing, distributions, evaluations, hybrid/ensemble methods, data transformation, kernel, and much more. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
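
      A minimal hypothesis-testing sketch with SciPy, illustrating the p-value discussion above; the two simulated groups and their means are illustrative assumptions.

      import numpy as np
      from scipy import stats

      # Two-sample t-test: is the difference between two group means statistically meaningful?
      rng = np.random.default_rng(0)
      group_a = rng.normal(loc=10.0, scale=2.0, size=50)
      group_b = rng.normal(loc=11.0, scale=2.0, size=50)

      t_stat, p_value = stats.ttest_ind(group_a, group_b)
      print(f"t = {t_stat:.2f}, p-value = {p_value:.4f}")
      # A small p-value (e.g. < 0.05) suggests the observed difference is unlikely under the
      # null hypothesis of equal means; it is not, by itself, proof of practical importance.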

    • Text Mining/Natural Language Processing (NLP)
      Text mining, also referred to as text data mining and roughly equivalent to text analytics, is the process of deriving high-quality information from text. High-quality information is typically derived by devising patterns and trends through means such as statistical pattern learning. Text mining usually involves structuring the input text (usually parsing, along with the addition of some derived linguistic features and the removal of others, and subsequent insertion into a database), deriving patterns within the structured data, and finally evaluating and interpreting the output. 'High quality' in text mining usually refers to some combination of relevance, novelty, and interest. Typical text mining tasks include text categorization, text clustering, concept/entity extraction, production of granular taxonomies, sentiment analysis, document summarization, and entity relation modeling (i.e., learning relations between named entities). Discussions: embedding, nlp, sentiment analysis, vsm, text preprocessing, encoding-decoding, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
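
      A minimal vector space model (VSM) sketch with scikit-learn's TfidfVectorizer, turning raw text into a structured matrix; the three example documents are made up.

      from sklearn.feature_extraction.text import TfidfVectorizer

      # Structure the input text: build a TF-IDF term-document matrix.
      docs = [
          "text mining extracts patterns from text",
          "sentiment analysis labels opinions in text",
          "graph theory studies vertices and edges",
      ]
      vectorizer = TfidfVectorizer(stop_words="english")
      X = vectorizer.fit_transform(docs)

      print(X.shape)                             # (3 documents, number of distinct terms)
      print(vectorizer.get_feature_names_out())  # the learned vocabulary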

    • Visualizations
      Data visualization is viewed by many disciplines as a modern equivalent of visual communication. It involves the creation and study of the visual representation of data. To communicate information clearly and efficiently, data visualization uses statistical graphics, plots, information graphics, and other tools. Numerical data may be encoded using dots, lines, or bars to visually communicate a quantitative message. Effective visualization helps users analyze and reason about data and evidence. It makes complex data more accessible, understandable, and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic (i.e., showing comparisons or showing causality) follows the task. Tables are generally used where users will look up a specific measurement, while charts of various types are used to show patterns or relationships in the data for one or more variables. Discussions: Geometry, Distance, UMAP/t-SNE, pyplot/seaborn/matplotlib, MDS, unfolding/manifolds, etc. A short sketch follows this entry.
    • 0
    • 0
    • No Topics
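
      A minimal matplotlib sketch of encoding numbers with dots and lines, as described above; the noisy sine data is simulated purely for illustration.

      import matplotlib.pyplot as plt
      import numpy as np

      # Simulated observations around a known trend.
      x = np.linspace(0, 10, 50)
      y = np.sin(x) + np.random.default_rng(0).normal(scale=0.1, size=x.size)

      fig, ax = plt.subplots()
      ax.scatter(x, y, label="noisy observations")                    # dots encode individual values
      ax.plot(x, np.sin(x), color="black", label="underlying trend")  # a line encodes the trend
      ax.set_xlabel("x")
      ax.set_ylabel("y")
      ax.legend()
      plt.show()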