These are public posts tagged with #matrixmultiplication.
A Novel Compiler Transformation for Fast Sparse Matrix Multiplication in GPUs
Sparse data structures are commonly used in neural…
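The transformation itself is behind the link, but the kind of data layout such compiler work targets can be sketched. Below is a minimal CSR (compressed sparse row) times dense-matrix product in plain Python; the function names are my own, and this illustrates the sparse format only, not the paper's technique.

```python
def csr_from_dense(A):
    """Convert a dense 2-D list into CSR form: (values, col_indices, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in A:
        for j, x in enumerate(row):
            if x != 0:
                values.append(x)
                col_idx.append(j)
        row_ptr.append(len(values))
    return values, col_idx, row_ptr

def spmm(values, col_idx, row_ptr, B):
    """Multiply a CSR matrix by a dense matrix B (list of rows)."""
    n_rows = len(row_ptr) - 1
    n_cols = len(B[0])
    C = [[0] * n_cols for _ in range(n_rows)]
    for i in range(n_rows):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            a, j = values[k], col_idx[k]
            for c in range(n_cols):
                C[i][c] += a * B[j][c]   # only nonzeros contribute work
    return C

A = [[1, 0, 2],
     [0, 0, 3]]
B = [[1, 0], [0, 1], [1, 1]]
print(spmm(*csr_from_dense(A), B))   # [[3, 2], [3, 3]], same as dense A @ B
```

The point of CSR is that the inner loops skip zeros entirely, which is exactly the irregular access pattern GPU compiler transformations try to tame.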
hgpu.org

“Beating NumPy’s Matrix Multiplication In 150 Lines Of C Code”, Aman Salykov (https://salykova.github.io/matmul-cpu).
Via HN: https://news.ycombinator.com/item?id=40870345
On Lobsters: https://lobste.rs/s/6cktqx/beating_numpy_s_matrix_multiplication
#C #MatrixMultiplication #Math #Performance #BLAS #LinearAlgebra #MatMul #Speed #NumPy #Optimization
This blog post explains how to optimize multi-threaded…
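Kernels like the one in the post are built around loop tiling (cache blocking) plus vectorized micro-kernels. The tiling idea alone can be sketched in a few lines; this is my own Python illustration of the concept, not the author's C code, and the actual speedup requires C with SIMD intrinsics.

```python
def matmul_blocked(A, B, block=2):
    """Tiled matrix multiply: process small sub-blocks that fit in cache."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i0 in range(0, n, block):
        for j0 in range(0, m, block):
            for p0 in range(0, k, block):
                # Inner loops touch only one tile of A, B, and C at a time,
                # so the working set stays cache-resident in a real kernel.
                for i in range(i0, min(i0 + block, n)):
                    for p in range(p0, min(p0 + block, k)):
                        a = A[i][p]
                        for j in range(j0, min(j0 + block, m)):
                            C[i][j] += a * B[p][j]
    return C

print(matmul_blocked([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```

The block size is the tuning knob: it is chosen so the three active tiles fit in L1/L2 cache.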
salykova

Researchers upend AI status quo by eliminating matrix multiplication in LLMs - Illustration of a brain inside of a light bulb. (credit: Gett... - https://arstechnica.com/?p=2033314 #matrixmultiplication #machinelearning #googlegemini #ucsantacruz #matrixmath #chatgpt #ternary #biz #matmul #gpu #ai
Running AI models without matrix math means far less…
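The work the article covers constrains weights to the ternary set {-1, 0, +1}, at which point a dot product needs no multiplications at all, only additions and subtractions. A toy sketch of that idea (my own illustration, not the paper's code):

```python
def ternary_dot(w, x):
    """Dot product with ternary weights: multiplications become add/sub."""
    acc = 0.0
    for wi, xi in zip(w, x):
        if wi == 1:
            acc += xi
        elif wi == -1:
            acc -= xi
        # wi == 0 contributes nothing, so sparse weights are free
    return acc

w = [1, -1, 0, 1]
x = [2.0, 3.0, 5.0, 7.0]
print(ternary_dot(w, x))   # 2 - 3 + 7 = 6.0
```

Scaled up to whole weight matrices, this removes the multiply units from the hot path, which is where the energy savings come from.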
Ars Technica

How much can we gain from Tensor Kernel Fusion on GPUs?
Kernel fusion is a crucial optimization technique for…
hgpu.org

Fast and Practical Strassen’s Matrix Multiplication using FPGAs
#OpenCL #FPGA #MatrixMultiplication #BLAS #LinearAlgebra #GEMM #Package
Matrix multiplication is a cornerstone operation in…
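For context on what the FPGA paper implements: Strassen's scheme multiplies 2x2 blocks with 7 multiplications instead of 8, a saving that compounds when applied recursively to sub-blocks. A scalar sketch of the base case (my own illustration):

```python
def strassen_2x2(A, B):
    """Strassen's 7-multiplication formula for a 2x2 product."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # Seven products m1..m7 recombine into the four entries of C = A @ B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Applied recursively, this yields an O(n^2.807) algorithm, though the extra additions make it practical only for fairly large blocks, which is part of what the FPGA work addresses.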
hgpu.org

Evaluation of computational and energy performance in matrix multiplication algorithms on CPU and GPU using MKL, cuBLAS and SYCL
#CUDA #SYCL #MKL #CUBLAS #MatrixMultiplication #LinearAlgebra #Performance #Package
Matrix multiplication is fundamental in the backpropagation…
hgpu.org

New Breakthrough Brings #MatrixMultiplication Closer to Ideal https://www.quantamagazine.org/new-breakthrough-brings-matrix-multiplication-closer-to-ideal-20240307/
By eliminating a hidden inefficiency, computer scientists…
Quanta Magazine

By eliminating a hidden inefficiency, computer scientists Ran Duan, Renfei Zhou and Hongxun Wu have come up with a new way to multiply large matrices that’s faster than ever
Quanta Magazine

Matrix multiplication breakthrough could lead to faster, more efficient AI models - When you do math on a computer, you fly through a numerical t... - https://arstechnica.com/?p=2008905 #massachusettsinstituteoftechnology #matrixmultiplication #aiandtheenvironment #tsinghuauniversity #machinelearning #williamkuszmaul #quantamagazine #volkerstrassen #aiefficiency #lasermethod #openaisora #chatgpt #chatgtp #biz #zhou #ai
At the heart of AI, matrix math has just seen its biggest…
Ars Technica

#DailyBloggingChallenge (137/200)
When looking at short-term memory, a portion of it is the active working memory.
One way to look at the active working memory is like matrix multiplication.
Let's say one is given a 5-digit number. It is fairly simple to return that number in the same order as given, which equates to multiplying by the identity matrix.
If one instead had to recite the digits in reverse order, the ones would move from the main diagonal to the anti-diagonal: a permutation matrix whose determinant is negative whenever the reversal amounts to an odd number of swaps.
Now imagine a more complex task, like ordering the months of the year alphabetically. Constructing the matrix that permutes the months vector is not so simple at first, though once it is instantiated, the operation becomes quite simple to repeat.
Depending on how many row swaps are needed to reach it from the identity matrix, that count could be a way to quantify the complexity of the task.
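The analogy above can be made concrete. This is my own sketch, not a cognitive model: recalling a sequence is cast as applying a permutation matrix to it.

```python
def permutation_matrix(perm):
    """Row i has a 1 in column perm[i], so row i of the output picks element perm[i]."""
    n = len(perm)
    return [[1 if j == perm[i] else 0 for j in range(n)] for i in range(n)]

def apply(P, v):
    """Ordinary matrix-vector product."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

digits = [5, 1, 4, 2, 9]
identity = permutation_matrix([0, 1, 2, 3, 4])  # repeat the number as given
reverse = permutation_matrix([4, 3, 2, 1, 0])   # ones on the anti-diagonal
print(apply(identity, digits))  # [5, 1, 4, 2, 9]
print(apply(reverse, digits))   # [9, 2, 4, 1, 5]

# The "months alphabetically" task is just a costlier permutation to build;
# once constructed, reapplying it is the same cheap operation.
months = ["Jan", "Feb", "Mar", "Apr"]
order = sorted(range(len(months)), key=lambda i: months[i])
print([months[i] for i in order])  # ['Apr', 'Feb', 'Jan', 'Mar']
```

The reversal matrix here uses two swaps (an even permutation of five elements), so its determinant is actually +1; the sign depends on the sequence length.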
#ShortTermMemory #ActiveWorkingMemory #memory #matrix #MatrixMultiplication #maths
#AI Reveals New Possibilities in #MatrixMultiplication.
Last month, a team at the artificial intelligence company #DeepMind showed how to tackle the problem from a new direction, reporting in a paper in Nature that they’d successfully trained a neural network to discover new fast algorithms for matrix multiplication. It was as if the #AI had found an unknown strategy for solving a monstrously complex Rubik’s Cube.
“It’s a very neat result,” said #JoshAlman, a computer scientist at #ColumbiaUniversity. But he and other matrix multiplication specialists also emphasized that such #AI assistance will complement rather than replace existing methods — at least in the near term. “It’s like a proof of concept for something that could become a breakthrough,” Alman said. The result will simply help researchers on their quest.
Inspired by the results of a game-playing neural network,…
www.quantamagazine.org

Whenever I see a cool new result in #ComputationalMath, I like to see if I can replicate it. So, last month when that Nature article came out about #MatrixMultiplication formulas from #AlphaTensor, I set out to see if I could get their formulas and verify them symbolically.
I was able to do that, and of course they were right. But I was excited to see Kauers and Moosbauer publish a response a couple of days later. So, here are their results replicated in a Maple Jupyter notebook: https://github.com/johnpmay/MapleSnippets/blob/main/KMtoFFM.ipynb
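For anyone without Maple, the same kind of symbolic check can be sketched in Python with SymPy. Here I verify Strassen's classic 7-product formula as the simplest test case; the AlphaTensor and Kauers-Moosbauer schemes check the same way, just with more products. This is my own sketch, not the linked notebook.

```python
import sympy as sp

# Symbolic entries of two generic 2x2 matrices.
a11, a12, a21, a22 = sp.symbols("a11 a12 a21 a22")
b11, b12, b21, b22 = sp.symbols("b11 b12 b21 b22")

# Strassen's seven products.
m1 = (a11 + a22) * (b11 + b22)
m2 = (a21 + a22) * b11
m3 = a11 * (b12 - b22)
m4 = a22 * (b21 - b11)
m5 = (a11 + a12) * b22
m6 = (a21 - a11) * (b11 + b12)
m7 = (a12 - a22) * (b21 + b22)

C = sp.Matrix([[m1 + m4 - m5 + m7, m3 + m5],
               [m2 + m4, m1 - m2 + m3 + m6]])
ref = sp.Matrix([[a11, a12], [a21, a22]]) * sp.Matrix([[b11, b12], [b21, b22]])

# Expanding the difference to the zero matrix proves the identity.
print((C - ref).expand())  # Matrix([[0, 0], [0, 0]])
```

The same pattern scales: load any decomposition's coefficient lists, form the products, recombine, subtract the reference product, and check the expansion is identically zero.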
Snippets of interesting Maple code. Contribute to johnpmay/MapleSnippets…
github.com

DeepMind breaks 50-year math record using AI; new record falls a week later
https://arstechnica.com/?p=1887641
#matrixmultiplication #machinelearning #AlphaTensor #AlphaZero #deepmind #alphago #Biz&IT #AI
AlphaTensor discovers better algorithms for matrix…
arstechnica.com