vault backup: 2023-05-22 17:32:00

Affected files:
.obsidian/community-plugins.json
.obsidian/graph.json
.obsidian/plugins/table-editor-obsidian/data.json
.obsidian/plugins/table-editor-obsidian/main.js
.obsidian/plugins/table-editor-obsidian/manifest.json
.obsidian/plugins/table-editor-obsidian/styles.css
.obsidian/workspace.json
Charities.md
Health/BWS.md
History/Britain.md
Lab/DNS.md
Lab/Deleted Packages.md
Lab/Ebook Laundering.md
Lab/Home.md
Lab/Mac.md
Lab/Photo Migration.md
Languages/Arabic.md
Money/Assets/Derivative.md
Money/Assets/Financial Instruments.md
Money/Assets/Security.md
Money/Econ.md
Money/Equity.md
Money/Giving.md
Money/Markets/Commodity.md
Money/Markets/Markets.md
Money/Markets/Types.md
STEM/AI/Literature.md
STEM/AI/Properties.md
STEM/CS/ABI.md
STEM/CS/Code Types.md
STEM/CS/Compilers.md
STEM/CS/Language Binding.md
STEM/CS/Languages/dotNet.md
STEM/CS/Quantum.md
STEM/CS/Resources.md
STEM/CS/Turing Machines.md
STEM/Maths/Algebra.md
STEM/Semiconductors/Equations.md
STEM/Signal Proc/Convolution.md
STEM/Signal Proc/Fourier Transform.md
STEM/Speech/Literature.md
STEM/img/ai-io.png
STEM/img/ai-nested-subjects.png
STEM/img/cli-infrastructure.png
Tattoo/Plans.md
Tattoo/img/chest.png
andy 2023-05-22 17:32:00 +01:00
parent 2874b7c524
commit be05b7905d
18 changed files with 235 additions and 25 deletions

AI/Literature.md

@@ -1,3 +1,4 @@
#lit
[https://web.stanford.edu/~jurafsky/slp3/A.pdf](https://web.stanford.edu/~jurafsky/slp3/A.pdf)
[Towards Data Science: 3 Things You Need To Know Before You Train-Test Split](https://towardsdatascience.com/3-things-you-need-to-know-before-you-train-test-split-869dfabb7e50)
[https://machinelearningmastery.com/train-final-machine-learning-model/](https://machinelearningmastery.com/train-final-machine-learning-model/)

AI/Properties.md Normal file (+59 lines)

@@ -0,0 +1,59 @@
# Three Key Components
1. Representation
- Declarative & Procedural knowledge
- Typically human-readable symbols
2. Reasoning
- Ability to solve problems
- Express and solve range of problems and types
- Make explicit and implicit information known to it
- Control mechanism to decide which operations to apply, when a solution has been found, and when to stop
3. Learning
An AI system must be able to
1. Store knowledge
2. Apply knowledge to solve problems
3. Acquire new knowledge through experience
![[ai-nested-subjects.png]]
# Expert Systems
- Usually easier to obtain the compiled experience of experts than to duplicate, for a network, the experience that made them experts
# Information Processing
## Inductive
- General patterns and rules determined from data and experience
- Similarity-based learning
## Deductive
- General rules are used to determine specific facts
- Proof of a theorem
Explanation-based learning uses both
# Classical AI vs Neural Nets
## Level of Explanation
- Classical has emphasis on building symbolic representations
- Models cognition as sequential processing of symbolic representations
- Neural nets emphasis on parallel distributed processing models
- Models assume information processing takes place through interactions of large numbers of neurons
## Processing style
- Classical processing is sequential
- Von Neumann Machine
- Neural nets use parallelism everywhere
- Source of flexibility
- Robust
## Representational Structure
- Classical emphasises language of thought
- Symbolic representation has quasi-linguistic structure
- New symbols created from compositionality
- Neural nets have difficulty describing the nature and structure of their representations
Symbolic AI is the formal manipulation of a language of algorithms and data representations in a top-down fashion
Neural nets work bottom-up
![[ai-io.png]]

CS/ABI.md

@@ -1,8 +1,8 @@
- How data structures & computational routines are accessed in machine code ([[Code Types]])
- Machine code therefore hardware-dependent
- API defines this structure in source code
- Adherence usually responsibility of
- [[Compilers]]
- OS
- Library author
@@ -13,7 +13,7 @@
- Stack organisation
- Memory access types
- Size, layouts and alignments of basic data types
- [[Calling Conventions]]
- How function arguments are passed
- Stack or register
- Which registers for which function param

CS/Code Types.md

@@ -27,9 +27,9 @@ Portable Code
- Compact numeric codes, constants and references
- Encode compiler output following analysis and validation
- Can be further compiled
- [[Compilers#JIT]]
- Typically passed to VM
- Java, [[Python]]
## Object Code
- Product of compiler

CS/Compilers.md

@@ -13,6 +13,7 @@ Just-in-Time
- Adaptive optimization
- Dynamic recompilation
- Microarchitecture-specific speedups
- [[ISA]]
## AOT
Ahead-of-Time

CS/Language Binding.md

@@ -4,22 +4,25 @@
## Runtime Environments
### Object Models
- COM
- [[C++]]
- Component Object Model
- MS only cross-language model
- CLI
- [[dotNet]]
- .NET Common Language Infrastructure
- Freedesktop.org D-Bus
- Open cross-platform-language model
### Virtual Machines
- CLR
- [[dotNet]]
- .NET Common Language Runtime
- Mono
- CLI languages
- Cross-platform
- Adobe Flash Player
- Tamarin
- JVM
- LLVM
- Silverlight

CS/Languages/dotNet.md

@@ -10,6 +10,7 @@
- JIT managed code into machine instructions
- Execution engine
- VM
- [[Language Binding#Virtual Machines]]
- Services
- Memory management
- Type safety
@@ -28,3 +29,5 @@
- Compiled CLI code
- Portable executable (PE)
- DLL, EXE
![[cli-infrastructure.png]]

CS/Quantum.md

@@ -1 +1,2 @@
#lit
[5 books](https://fivebooks.com/best-books/quantum-computing-chris-bernhardt/)

CS/Resources.md

@@ -1 +1,2 @@
#lit
[Wigle - wifi enumerating](http://wigle.net)

CS/Turing Machines.md Normal file (+16 lines)

@@ -0,0 +1,16 @@
# David Hilbert
- Wondered if there was a universal algorithmic process to decide whether any mathematical proposition was true
- Then suggested that there were no unsolvable problems
# Incompleteness Theorem
## Kurt Gödel
You might be able to prove every conceivable statement about numbers within a system by going outside the system in order to come up with new rules and axioms, but by doing so you'll only create a larger system with its own unprovable statements
# Turing Machine
- Model of computation
- Resolves whether or not mathematics contains problems that are incomputable
- No algorithmic solution
### Church-Turing Thesis
Any algorithm capable of being devised can be run on a Turing machine
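
A Turing machine is easy to make concrete: a finite control, a tape, and a transition table. Below is a minimal sketch (not from the note); the example transition table and all names are illustrative assumptions, implementing a unary increment.

```python
# Minimal Turing machine sketch: finite control + tape + transition table
# mapping (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(transitions, tape, state="start", accept="halt", blank="_"):
    cells = dict(enumerate(tape))          # sparse tape, grows in both directions
    head = 0
    while state != accept:
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Illustrative program (an assumption, not from the note): scan right over a
# unary number and write one extra '1', i.e. compute n + 1.
increment = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(increment, "111"))   # -> 1111
```

By the Church-Turing thesis, anything we would call an algorithm can be expressed as such a transition table, given enough states and tape.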

Maths/Algebra.md Normal file (+18 lines)

@@ -0,0 +1,18 @@
# Field
- Set on which addition and multiplication defined
- Behave same as on rational and real numbers
- Subtraction, division implied
- Examples
- Rational numbers
- Real numbers
- Complex numbers
- Any field can be used as scalars for a vector space
- A commutative ring where $0 \neq 1$ and all nonzero elements are invertible (see the sketch below)
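
Besides the rationals, reals and complex numbers, finite examples also satisfy this definition. A minimal sketch (assuming $p = 7$ as an arbitrary prime) checking that every nonzero element of the integers mod $p$ has a multiplicative inverse:

```python
# Sketch: the integers mod a prime p form a field. 0 != 1, and every
# nonzero element has a multiplicative inverse. p = 7 is an arbitrary choice.
p = 7

def inverse(a, p):
    """Multiplicative inverse of a (mod p), found by brute-force search."""
    return next(b for b in range(1, p) if (a * b) % p == 1)

for a in range(1, p):
    b = inverse(a, p)
    assert (a * b) % p == 1
    print(f"{a} * {b} = 1 (mod {p})")

# With a composite modulus the check fails (2 has no inverse mod 6),
# so Z/6Z is a commutative ring but not a field.
```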
## Vector Space
- Set of vectors
- Can be added together and multiplied by scalar
- Scalars may be real or complex numbers, or elements of any field
- The choice of scalar field is part of the definition
- Must satisfy vector axioms

Semiconductors/Equations.md

@@ -11,7 +11,7 @@ $$J=\sigma E$$
$$V_{bi} = \frac{kT}{q}\ln\left(\frac{N_D N_A}{n_i^2}\right)$$
- $V_{bi}$ = Built-in Potential (worked example below)
[[Doping]]
$$J=nev$$
- $n$ = Charge Density
- $e$ = Charge
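
A quick numeric check of the built-in potential formula above. The doping levels and intrinsic carrier concentration are assumed example values for silicon at 300 K, not taken from the note:

```python
import math

# Assumed example values (silicon at T = 300 K), not from the note:
k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C
T = 300.0             # temperature, K
N_D = 1e17            # donor concentration, cm^-3
N_A = 1e17            # acceptor concentration, cm^-3
n_i = 1.5e10          # intrinsic carrier concentration, cm^-3

V_bi = (k * T / q) * math.log(N_D * N_A / n_i**2)
print(f"V_bi = {V_bi:.2f} V")   # roughly 0.81 V for these values
```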

Signal Proc/Convolution.md Normal file (+26 lines)

@@ -0,0 +1,26 @@
Integral operator
- Satisfies the mathematical properties of an integral operator
- Integral of the product of two signals after one has been reversed and shifted
$$x(t)=x_1(t)\circledast x_2(t)=\int_{-\infty}^\infty x_1(t-\tau)\cdot x_2(\tau)d\tau$$
# Properties
1. $x_1(t)\circledast x_2(t)=x_2(t)\circledast x_1(t)$
1. Commutativity
2. $(x_1(t)\circledast x_2(t))\circledast x_3(t)=x_1(t)\circledast (x_2(t)\circledast x_3(t))$
1. Associativity
3. $x_1(t)\circledast [x_2(t)+x_3(t)]=x_1(t)\circledast x_2(t)+ x_1(t)\circledast x_3(t)$
1. Distributivity
4. $Ax_1(t)\circledast Bx_2(t)=AB[x_1(t)\circledast x_2(t)]$
1. Associativity with Scalar
5. Symmetrical graph about origin
# Applications
1. Communications systems
- Shift signal in frequency domain (Frequency modulation)
2. System analysis
- Find system output given input and transfer function
# Polynomial Multiplication
- Convolving the coefficients of two polynomials gives the coefficients of their product (see the sketch below)
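
A minimal sketch of the discrete case using `np.convolve` (the polynomials are arbitrary examples); it also shows the commutativity property from the list above:

```python
import numpy as np

# Coefficients in ascending powers: (1 + 2x)(3 + 4x + 5x^2), arbitrary example.
a = np.array([1, 2])
b = np.array([3, 4, 5])

product = np.convolve(a, b)
print(product)             # [ 3 10 13 10] -> 3 + 10x + 13x^2 + 10x^3
print(np.convolve(b, a))   # same result: convolution is commutative
```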

Signal Proc/Fourier Transform.md Normal file (+66 lines)

@@ -0,0 +1,66 @@
$$X(\omega)=\int_{-\infty}^{\infty}x(t)e^{-j\omega t}dt$$
$$x(t)=\frac{1}{2\pi}\int_{-\infty}^{\infty}X(\omega)e^{j\omega t}d\omega$$
## Discrete-Time
$$X(\omega)=\sum_{-\infty}^{\infty}x[n]e^{-j\omega n}$$
$$x[n]=\frac{1}{2\pi}\int_{2\pi}X(\omega)e^{j\omega n}d\omega$$
## Discrete Fourier Transform
Digital Signal
$$X[k]=\sum_{n=0}^{N-1}x[n]e^{-j\omega_{k}n}$$
$$x[n]=\frac{1}{N}\sum_{k=0}^{N-1}X[k]e^{j\omega_{k}n}, n=0,1,\ldots,N-1$$
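A direct implementation of the DFT pair above, assuming the usual convention $\omega_k = 2\pi k/N$ and checked against NumPy's FFT; the test signal is arbitrary:

```python
import numpy as np

def dft(x):
    """Direct O(N^2) DFT: X[k] = sum_n x[n] * exp(-j*w_k*n), w_k = 2*pi*k/N."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    return np.exp(-2j * np.pi * k * n / N) @ x

def idft(X):
    """Inverse DFT: x[n] = (1/N) * sum_k X[k] * exp(j*w_k*n)."""
    N = len(X)
    k = np.arange(N)
    n = k.reshape(-1, 1)
    return (np.exp(2j * np.pi * n * k / N) @ X) / N

x = np.random.randn(8)                      # arbitrary test signal
assert np.allclose(dft(x), np.fft.fft(x))   # matches the library FFT
assert np.allclose(idft(dft(x)), x)         # synthesis inverts analysis
```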
## Power Spectral Density
PSD
$$P[k]=|X[k]|^2$$
## Spectrogram
- PSD vertically
- Frequency power over time horizontally
- ___Time and frequency resolution inversely proportional___
- Resolution
- Frequency
- $f_s/N$
- Time
- $N/f_s$
- STFT has fixed resolution depending on window size
- Wider window
- Better frequency res
- Worse time resolution
- Can't tell where stuff changes with big window
- Can't use too wide
- Frequency can change during window
- 20-30ms window of speech usually treated as quasi-stationary
- Overlapping window
- Hop size of 5ms
- Appending windows can cause discontinuities
- Use window function to smooth
- Hann
## Fast Fourier Transform
FFT
- Faster version of DFT
- Three parts
- Shuffling
- Bit reversal
- Shuffle the N-point input into N one-point signals
- N one-point DFTs
- Merge
- N one-point DFTs into one N-point DFT
- Butterfly merging equations (see the sketch below)
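
A sketch of the radix-2 Cooley-Tukey idea in its recursive form (assuming the length is a power of two): the even/odd splits perform the bit-reversal shuffle implicitly, the recursion bottoms out at one-point DFTs, and the butterfly step merges the halves:

```python
import numpy as np

def fft_radix2(x):
    """Recursive radix-2 Cooley-Tukey FFT (assumes len(x) is a power of two)."""
    N = len(x)
    if N == 1:
        return x                    # a one-point DFT is just the sample itself
    even = fft_radix2(x[0::2])      # DFT of even-indexed samples
    odd = fft_radix2(x[1::2])       # DFT of odd-indexed samples
    twiddle = np.exp(-2j * np.pi * np.arange(N // 2) / N)
    # Butterfly merge: combine two N/2-point DFTs into one N-point DFT
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

x = np.random.randn(16)
assert np.allclose(fft_radix2(x), np.fft.fft(x))
```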
## Short-Time Fourier Transform
STFT
- Short-term
- N-point windowed DFT
- Probably use FFT
$$X[k,m]=\sum_{n=0}^{N-1}x[m\delta+n]w(n)e^{-j\omega_kn}$$
- $\omega$
- Discrete angular frequency
- $m$
- Time-frame index
- $\delta$
- Hop size
- $w(n)$
- Window function
- Hann
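
A direct sketch of the windowed DFT defined above, using a Hann window. The 16 kHz sample rate and the test tone are assumptions; the 25 ms window and 5 ms hop follow the figures quoted earlier:

```python
import numpy as np

fs = 16000                      # assumed sample rate, Hz
N = int(0.025 * fs)             # 25 ms window (within the 20-30 ms range above)
hop = int(0.005 * fs)           # 5 ms hop size (delta)
w = np.hanning(N)               # Hann window to smooth frame edges

t = np.arange(fs) / fs          # 1 s arbitrary test signal: 440 Hz tone
x = np.sin(2 * np.pi * 440 * t)

# X[k, m] = sum_n x[m*hop + n] * w(n) * exp(-j * 2*pi*k*n / N)
frames = np.stack([x[m * hop : m * hop + N] * w
                   for m in range((len(x) - N) // hop + 1)])
X = np.fft.fft(frames, axis=1)      # one N-point DFT per frame
P = np.abs(X) ** 2                  # power spectral density of each frame
print(X.shape)                      # (time-frame index m, frequency bin k)
print(f"frequency resolution = {fs / N:.1f} Hz, time resolution = {N / fs * 1000:.0f} ms")
```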

Speech/Literature.md Normal file (+15 lines)

@@ -0,0 +1,15 @@
#lit
Daniel Jurafsky
James H. Martin
[Speech and Language Processing - 3rd Ed. Draft](https://web.stanford.edu/~jurafsky/slp3/ed3book.pdf)
[Hidden Markov Models](https://web.stanford.edu/~jurafsky/slp3/A.pdf)
# Coursework
- [Stack Overflow, Spectrogram Matlab Explanation](https://stackoverflow.com/questions/29321696/what-is-a-spectrogram-and-how-do-i-set-its-parameters)
- [Matlab - LPC Analysis and Synthesis of Speech](https://uk.mathworks.com/help/dsp/ug/lpc-analysis-and-synthesis-of-speech.html)
- [Matlab - Formant Estimation with LPC Coefficients](https://uk.mathworks.com/help/signal/ug/formant-estimation-with-lpc-coefficients.html)
- [Matlab - Linear Prediction and Autoregressive Modeling](https://uk.mathworks.com/help/signal/ug/linear-prediction-and-autoregressive-modeling.html)
- [Quefrency Paper](https://www.researchgate.net/publication/3321562_From_Frequency_to_Quefrency_A_History_of_the_Cepstrum)
- [Aalto Uni - Pre-emphasis](https://wiki.aalto.fi/display/ITSP/Pre-emphasis)
- [Preemphasis paper](https://mini.dcs.shef.ac.uk/wp-content/papercite-data/pdf/loweimi_nolisp13.pdf)
- [Quora - Preemphasis](https://www.quora.com/Why-is-pre-emphasis-i-e-passing-the-speech-signal-through-a-first-order-high-pass-filter-required-in-speech-processing-and-how-does-it-work)

img/ai-io.png Normal file (binary, 20 KiB)

img/ai-nested-subjects.png Normal file (binary, 25 KiB)

img/cli-infrastructure.png Normal file (binary, 77 KiB)