Posted: 12/7/2018 5:53:46 PM EDT
@ar-jedi
@Zhukov

Which is better for a bored software dev / network engineer / EE dropout to learn?
Link Posted: 12/8/2018 3:30:53 AM EDT
[#1]
Quoted:
Which is better for a bored software dev / network engineer / EE dropout to learn?
View Quote

neither.

well, first i have a couple of questions...

where are you going with this?  academic interest?  career change?  etc
are you trying to make an LED blink on an eval board, or design a high speed ASIC?
does it have to work?  

20 years ago, one might say Verilog was the "west coast" HDL preference, and VHDL was the "east coast and europe" HDL preference.
so if you wanted a job in a certain geographic area, you should learn to speak their language as a starting point.

nowadays, there is much more mixing and internal reuse and commercially available IP blocks and so on, so you may need to be more or less bilingual anyway.
no one writes all the logic for an FPGA or ASIC; you download/license/buy library cores for commodity functions.
some of these cores will be in the one HDL or the other.  modern EDA systems are well adapted to this.  but...

THAT SAID, first realize that a HUGE / MAJOR / CRITICAL part of logic design is verification.  
and with increasing project complexity comes geometrically increasing verification complexity.

the hardest part of CPLD/FPGA/ASIC design often is not writing the functional code -- it's validating the code for correctness and timing and so on.
test coverage becomes your mission: making sure there are no untested -- and worse, untestable -- logic, synchronization, and other blocks.  
and once that code is on target (e.g. pumped onto an FPGA), additional tools which interact with the EDA environment are used.

incidentally, in some complex development environments, the primary authors of the code base will not be generating the test bench; instead another set of smart people will implement an overall formal verification process -- translating the system and device requirements into actual test harnesses, and taking an adversarial approach to the validation.
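
to make the verification point concrete, below is a minimal sketch of a self-checking SystemVerilog bench wrapped around a trivial DUT.  every name in it (counter, tb_counter) is made up for illustration, and a real bench would be far larger and far more adversarial:

// minimal sketch, all names illustrative: a trivial counter DUT
// plus a self-checking SystemVerilog bench with an immediate assertion.
module counter (
  input  logic       clk,
  input  logic       rst_n,
  output logic [7:0] count
);
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n) count <= '0;
    else        count <= count + 8'd1;
endmodule

module tb_counter;
  logic       clk = 0;
  logic       rst_n;
  logic [7:0] count;

  counter dut (.clk(clk), .rst_n(rst_n), .count(count));

  always #5 clk = ~clk;                 // free-running clock

  initial begin
    rst_n = 0;
    @(negedge clk);                     // release reset away from the active edge
    rst_n = 1;
    repeat (10) @(posedge clk);
    #1;                                 // let the last nonblocking update settle
    assert (count == 8'd10)             // check against the spec, not the RTL
      else $error("count=%0d, expected 10", count);
    $finish;
  end
endmodule

a real verification environment layers constrained-random stimulus, functional coverage, and UVM-style class-based components on top of this, but the basic shape -- drive the DUT, then assert against the spec -- is the same.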

for this reason, SystemVerilog, which incorporates more aspects of test bench generation and therefore verification, would be my suggested route.
modern complex projects are often hybrid: the top level of the design is done in SystemVerilog, but aspects of the logic may be imported as Verilog and VHDL blocks.

read for a bit:
https://en.wikipedia.org/wiki/SystemVerilog
and
http://www.asic-world.com/systemverilog/intro.html
and
https://www.doulos.com/knowhow/sysverilog/whatissv/

hence if you are going to start down a path leading to a career, learning Verilog is probably the right first step, since you can migrate easily into SystemVerilog, which is where the complex jobs are done.
if you are going to play around with a FPGA development board and such, learn whatever HDL the example code for the dev board was written in.  the concepts in VHDL and Verilog are the same; the structure and syntax differ.
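
to see how gentle the Verilog-to-SystemVerilog migration is, here is the same flop written both ways (a hypothetical snippet, not from any real design):

// the same flop with synchronous clear, first in plain Verilog...
module dff_v (input wire clk, input wire clr, input wire d, output reg q);
  always @(posedge clk)
    if (clr) q <= 1'b0;
    else     q <= d;
endmodule

// ...and in SystemVerilog: 'logic' replaces the wire/reg split, and
// always_ff tells the tools (and the reader) this must synthesize to a flop.
module dff_sv (input logic clk, input logic clr, input logic d, output logic q);
  always_ff @(posedge clk)
    if (clr) q <= 1'b0;
    else     q <= d;
endmodule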
Link Posted: 12/8/2018 9:27:56 AM EDT
[#2]
Thank you! I knew I could rely on you for an informative and well thought out reply.

Quoted:

well, first i have a couple of questions...

where are you going with this?  academic interest?  career change?  etc
are you trying to make an LED blink on an eval board, or design a high speed ASIC?
does it have to work?  
View Quote
Partly academic interest. I have to keep learning new things at a deep level or else I suffer from terminal boredom. For the next six months, I have targeted ARM (STM32 specifically), FPGA, and SoC.

Yes, making an LED blink on an eval board will likely be a first step.

Partly possible career change (however, I might be too old for that).
Link Posted: 12/8/2018 1:06:08 PM EDT
[#3]
Quoted:
For the next six months, I have targeted ARM (STM32 specifically), FPGA, and SoC.
View Quote
the de facto approach to learning this is to get (or get access to) a dev board with a Xilinx Zynq device, or a dev board with an Altera/Intel SoC device.  in these types of devices, there is a single- or multicore ARM processor embedded within the FPGA fabric.  you can, with very little FPGA coding, boot the processor and get to a U-Boot prompt and eventually a linux prompt.

with more coding (and/or integration of library cores), synthesized peripherals can be attached to the core via a couple of means.  so, PCIe root/target, serial uarts, SPI/i2c controllers, USB host/endpoint, etc etc etc are some of the myriad possible "soft" peripherals that can be dynamically instantiated in the FPGA fabric, so when the ARM processor is taken out of reset the peripherals are magically there.  (think The Matrix.)

this is what makes the FPGA-based SoC concept so compelling: your next version of the product needs another feature?  don't redesign the board or change the processor architecture, just pull the logic in and recompile the HDL.  you can even field upgrade your current products this way; a new FPGA load can add significant new features, even features that we would normally associate with hardware changes.

https://www.xilinx.com/products/silicon-devices/soc.html
https://www.intel.com/content/www/us/en/products/programmable/soc.html
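
as a toy illustration of the "soft peripheral" idea, here is a memory-mapped LED register that could live in the fabric.  everything in it is hypothetical -- a real design would hang off AXI4-Lite via the vendor's processing-system wrapper rather than this simplified write port:

// toy sketch: a memory-mapped LED register in the fabric.  the bus
// here is a made-up stand-in for AXI4-Lite; all names are illustrative.
module led_reg #(parameter logic [7:0] ADDR = 8'h10) (
  input  logic        clk,
  input  logic        rst_n,
  input  logic        wr_en,     // write strobe from the (simplified) bus
  input  logic [7:0]  wr_addr,
  input  logic [31:0] wr_data,
  output logic [3:0]  led
);
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n)                        led <= '0;
    else if (wr_en && wr_addr == ADDR) led <= wr_data[3:0];
endmodule

when the ARM core writes that address, the LEDs change -- the peripheral is there without any board change, exactly as described above.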

Quoted:
Yes, making an LED blink on an eval board will likely be a first step.
View Quote
getting your development environment set up and learning how to pump and debug the FPGA is the biggest challenge; the logic to make the LED blink at 1Hz is truly the easy part.  
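
for scale, here is roughly all the blinker takes -- a sketch assuming a 50 MHz board clock (the clock rate, and hence the divider, is board-specific):

// 1 Hz LED blinker, assuming a 50 MHz input clock (board-specific).
module blink #(parameter int CLK_HZ = 50_000_000) (
  input  logic clk,
  output logic led
);
  logic [$clog2(CLK_HZ)-1:0] count = '0;   // power-up init works on most FPGAs
  logic led_q = 1'b0;
  assign led = led_q;

  always_ff @(posedge clk)
    if (count == CLK_HZ/2 - 1) begin       // toggle every half second -> 1 Hz
      count <= '0;
      led_q <= ~led_q;
    end else
      count <= count + 1'b1;
endmodule

wire the led output to a board LED pin in the constraints file and that's the whole design.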
getting good at FPGA development is not coding per se; it's verification and the ability to use in-situ tools such as Signal Tap (Altera, now Intel), ILA (Xilinx), and others.  in most cases these are cores that you build into your design that let you do all sorts of realtime on-target analysis.

https://www.intel.com/content/www/us/en/programmable/support/training/course/odsw1164.html
https://www.xilinx.com/products/intellectual-property/ila.html
Link Posted: 12/8/2018 4:26:58 PM EDT
[#4]
Link Posted: 12/9/2018 8:55:28 PM EDT
[#5]
Quoted:

the de facto approach to learning this is to get (or get access to) a dev board with a Xilinx Zynq device, or a dev board with an Altera/Intel SoC device.  in these types of devices, there is a single- or multicore ARM processor embedded within the FPGA fabric.  you can, with very little FPGA coding, boot the processor and get to a U-Boot prompt and eventually a linux prompt.
View Quote
So something like: https://store.digilentinc.com/cora-z7-zynq-7000-single-core-and-dual-core-options-for-arm-fpga-soc-development/

Or: https://store.digilentinc.com/zybo-z7-zynq-7000-arm-fpga-soc-development-board/

Or are we talking about the $3k - $5k boards?